Episode 37 AI Full Transcript

Cybersecurity and AI | What the Experts Are Worried About, and What They're Doing

Francie Dudrey, Mike Pedrick, Chris Roberts  ·  September 3, 2024

◆ ◆ ◆
Speakers
Joe Patti — Host
Adam Roth — Host
Francie Dudrey — Guest
Mike Pedrick — Guest
Chris Roberts — Guest
UNKNOWN — Guest
Joe Patti00:05

Welcome to the Security Cocktail Hour. I'm Joe Patti.

Adam Roth00:08

I'm Adam Roth, and this is Deadpool Bear.

Joe Patti00:11

Yes, your favorite. So we got to get right into it. We have a lot of guests. We have an actual panel today. We have Francie Dudrey. Hi, Francie.

Francie Dudrey00:22

You nailed my last name. I did? No, but everyone calls me Francie. You said it wrong.

Joe Patti00:26

It was Taco Doll Dudry. I was just using the force. You know, it's good karma, that kind of thing. We have Mike Pedrick. Hey, Mike. Great. And returning to the show, one of our favorite people, the legendary Chris Roberts. Hey Chris.

Mike Pedrick00:45

It's good to have him back. Thank you guys. I appreciate it.

Joe Patti00:49

Yeah. And I got to say, Chris, when we did the prep (we do a prep call before this show), after we did the prep call, Chris, did you go out and hurt yourself on your mountain bike or something? Oh, yeah. So be careful, man. Please. We don't want that on our heads.

Chris Roberts01:07

I'm still banned from going off-road, so I just got back from a bike ride where I'm doing like gravel and trails. I still managed to fall off the stupid thing, but it is what it is.

Joe Patti01:17

Okay, well, I hope you're okay. Awesome. All right, and before I forget, like I always do: if you're listening on YouTube, please like, subscribe, and comment. Please follow us on Spotify. Give us a comment. Give us hate mail. I love hate mail. Anything.

Francie Dudrey01:36

Did I just completely lose my street cred doing that? Probably. No, I don't think so.

Joe Patti01:43

It's fine. My kids will tell us if it's cringe. It's cool. Don't worry.

Adam Roth01:49

Wait, Joe, is this the game show edition?

Joe Patti01:55

No, this is not the game show edition. Not, not yet. Oh. I'm not, I'm not wearing like a, like a player jacket, you know, we're going to have to like get dressed up for the game show edition or something.

Adam Roth02:05

Survey says, what can you do with a pickle?

Francie Dudrey02:08

Starts moving. I don't know. Oh no, let's not talk about moving. There it is.

Joe Patti02:18

So we do not have a themed drink like we usually do, because we could barely get everyone together at once, much less agree on a drink, I think. So I got a white wine because it was handy and it's 100 degrees out. What have we all got? Iced coffee. Iced coffee, okay.

Chris Roberts02:46

Clear liquid in a mason jar. Clear liquid in a mason jar. We need to get that tested.

Francie Dudrey02:54

So my drink, I don't know if you can read that, but it says N.A. Dirty Martini, because I don't drink alcohol anymore, because I'm batshit crazy. So, um, I have my mocktail that my husband made me. It is a grape instead of an olive, and it's like Capri Sun and lime juice. So I'm going bougie, folks.

Joe Patti03:14

You know, even if you don't drink, I think you can still have an olive. That should be okay. But I'm sorry to you.

Francie Dudrey03:19

No, I can have an olive. We just don't have olives in the house.

Joe Patti03:22

Oh, all right. Fair enough.

Adam Roth03:24

By the way, nice shirt. I must tell you that's right.

Francie Dudrey03:27

Excellent. I mean, it's got a deep V, which I appreciate. It's breathable. It's soft. And it's on sale for $99.99. Drop a comment and we'll send you a shirt.

Adam Roth03:49

No, no, no, that's expensive.

Joe Patti03:51

I didn't say it was free. All right, Chris, what do you got there? You on the wagon these days or what?

Chris Roberts03:59

Yeah, this is tonic and lime, so no gin. Same thing, taking a break from the alcohol side of the world. I've done that for a couple of months. Good for you. Yeah, it's a bunch of different reasons. Yeah, I had to take a break.

Francie Dudrey04:15

I like to say that I have an allergy to alcohol: I break out in handcuffs. Break out in handcuffs! We just don't do that anymore. In my neck of the woods... lots of stories.

Joe Patti04:33

Oh god. I'm gonna look only one drink. It's a little lush.

Adam Roth04:35

Oh, I'll choose everyone. Wait, so this is a Security Mocktail Hour? Yeah, possibly.

Mike Pedrick04:40

I'm gonna call it back. There we go. I got a table full of alcohol sitting behind me that I've still got to catalog and get the heck out of here, but yeah, it is what it is.

Joe Patti04:51

All right, so we're going to have a wide-ranging discussion, I'm sure, on just about anything we want. The theme was going to be AI, so we can talk about AI or whatever else you want. We put together some extremely insightful questions, and I swear I didn't use ChatGPT for it. I would not subject you to that, no. So here's the question. Here's the first one. Everyone is talking AI. AI, AI, AI. Survey says AI. You need to have an AI strategy. Companies get on their earnings calls and they say, oh, we have AI, and their stock price goes up 10% for no other reason. So everyone's talking about it. Investors want to see it. But here's the thing: who's actually asking for it, especially in security? At least this generative AI stuff. Is anyone actually asking for it? And Francie, since you are the only lady present and you are actually in marketing, I thought we would start with you.

Francie Dudrey05:55

So there's a ton of conversation in cybersecurity around AI, and it's wide-ranging. I've talked to Mike and Chris about AI, the benefits of it, the ethical implications of it. I think security people are the most skeptical people I've ever met in my entire life. Like I said, Mike lives in a paranoid state about 95% of the time. And well he should, well he should, because there are implications there. I think there are some exciting ways that you can leverage AI in cybersecurity to help, because we all know about the cybersecurity talent shortage, and, you know, there's just so much that it can do, but you have to do it the right way. You can't just be pulling AI from, like, you know, open-source data and stuff like that. You need to make sure that it's kind of locked in. We're actually looking at it at my company, at how we can do stuff that leverages AI in a way that's ethical, that's safe, but also can do some pretty powerful stuff. So I think people are interested in it. I think they're just kind of nervous about how they would actually apply it. I hear a lot of companies talking about AI. I mean, I'm a marketing person and I know why they're talking about it: because SEO, everyone's talking about it. But they don't get into what it actually looks like. And I think that's the issue: like, what are you actually doing with AI? And Mike is the one who told me some stories about how people have gotten in trouble leveraging AI because they just didn't know how to use it. Yeah.

Chris Roberts07:39

I think you need it, though. Just to expand on that point super quick: all of us as practitioners always get asked, what's going to be the big thing next year? What are your predictions for the following year? And for probably two years running, I've said, hey, it's all well and good that we've got this new toy, this big thing that has captured the attention of folks inside and outside of our industry. But the problem is that the tool itself isn't the problem. The people are the problem, right? Yeah. So we had that case. The first one that I think really broke the membrane of the social consciousness was that lawyer who... Oh yeah.

Francie Dudrey08:23

It's my favorite story.

Chris Roberts08:24

Straight up. Run-of-the-mill personal injury case, right? Somebody on an airplane gets bumped by the drink cart and files a suit against the airport or airline, and this attorney is responsible for pulling together the brief: here are, you know, cases that establish precedent, et cetera. This jackhole feeds details into ChatGPT, gets output, and passes that on down the line. And somebody, some paralegal probably, some intern potentially, starts looking at it and says, these cases don't exist. Like, these cases are fabricated from whole cloth. And fast forward to the end of this whole story: guy's been disbarred. He's no longer a lawyer.

Joe Patti09:14

Was he actually disbarred for that? I didn't realize that.

Chris Roberts09:16

That's what I heard, yeah. Wow. This is my point, right? A lot of folks will say, dear ChatGPT, dear Copilot, dear whatever, right? Here's my question. Here's my prompt. And footnote, or side note, here: the number of people who think that they are, you know, David Carradine, guru, master, you know, kung fu, my prompts are the best... get out of town. Your prompt is exactly the same as everybody else's, right? But anyway. Not reviewing the content is a sin. Not reviewing the output is just short-sighted and getting people in trouble. The case, who was the rapper? There was a famous rapper whose attorney... this was a legitimate case.

Francie Dudrey10:01

Was it Wyclef? Maybe.

Chris Roberts10:05

No, I don't think it was Wyclef. It was somebody that, well, I don't know any of them, but, you know, I didn't recognize the name. But his attorney pulled together a closing statement for the jury using generative AI. And it was disjointed. Sentences didn't make sense together. It was filled with, you know... it just wasn't right. And so it could potentially have cost this cat the case, right? And so when you think about the ramifications of being lazy with this tool, this very, very powerful tool, the ramifications not just to you but to others are massive. And, you know, our shortsightedness as a race, as a collective whole, has gotten us into trouble in the past. And I think this is another one of those cases where AI is fine. There's nothing wrong with AI or the advent of AI. I'm not worried about Skynet. I'm worried about people doing themselves harm as a result of AI.

Adam Roth11:12

So I think the word this week is hallucination.

Francie Dudrey11:16

Oh, hallucinations.

Adam Roth11:18

I hate that word. And I also, you know, Mike, I think you're 100% correct, right? When you use anything, any content, it behooves you, not only from an integrity standpoint, but from multiple standpoints to check the authenticity of what is being presented to you. And that's an issue.

Chris Roberts11:45

Just super quick. We have a whole group of folks, right? Otherwise intelligent folks. I don't mean to make that sound as derogatory as it did as soon as I gave it oxygen. But we have very intelligent folks who are going to ask AI about the authenticity of the output of AI, right? It's an Ouroboros. It's a snake eating its own tail. This is going to spiral. It's going to continue to spiral.

Mike Pedrick12:09

So by the time this comes out, I will be in a role that will actually be announced about three days after we record this. And that role will be doing a whole bunch of work with deepfake AI. Wow, cool.

Chris Roberts12:30

Wait, wait, wait. No, we need to hear more. Breaking news, breaking news, breaking news. You're going to be stopping them or making them?

Mike Pedrick12:42

So this is where life gets really interesting, because like anything, it's an arms race. And if you look at AI, AI is very good at taking a large volume of data and, to some degree, making some sense of it. Now, if you look at an AI system, it can take me and turn me into Mike, turn me into Francie, Joe, or Adam, or anybody else it wanted to. And that is where, to me, a lot of the power is coming in, which is in identity. So if you start taking a look at AI and its uses in identity, specifically on the fraudulent side of identity, I think the highest case currently out there is a single instance of $25 million getting taken, because an AI architecture was able to be used to basically bypass a multi-factor system that was using facial and other recognition technologies and initiate a transfer. So kind of like to Mike's point, what you're now seeing is an adversary using an intelligence technology to make another me. And then on the defensive side, you're going to have to have, and you start to see it, but not as much as I think we'll see in the next couple of years, an AI that says, hmm, how real is this? What algorithms am I going to use? What data points am I going to use? Where was this taken? Where is this from? What's the background? What's the frame rate? What's the refresh? And that list of other things is where intelligence and AI is going to become more useful. Touch wood, but let's face it: 40 years of running technology and we still don't know whose hands are on the keyboard, doing what, and for what reasons. So for me, I want to see it, especially on the deepfake side, be able to identify the right person, right place, right motives.
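The detection side described here, fusing many weak signals (frame rate, background consistency, provenance, and so on) into one verdict, can be sketched roughly like this. Every signal name, weight, and threshold below is invented for illustration; real detectors use trained models, not hand-set weights.

```python
# Hypothetical sketch: combine per-signal suspicion scores (each in [0, 1])
# into a single weighted deepfake-likelihood score. Names and weights are
# made up; a production system would learn these from labeled data.

def deepfake_score(signals: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of the suspicion scores for the signals present."""
    total = sum(weights[name] for name in signals)
    return sum(signals[name] * weights[name] for name in signals) / total

weights = {"frame_rate_jitter": 2.0, "background_warp": 3.0,
           "provenance_mismatch": 3.0, "refresh_artifacts": 2.0}

# One hypothetical video clip, scored by four independent checks.
sample = {"frame_rate_jitter": 0.2, "background_warp": 0.9,
          "provenance_mismatch": 0.8, "refresh_artifacts": 0.4}

score = deepfake_score(sample, weights)          # 0.63 for this sample
verdict = "likely fake" if score > 0.6 else "inconclusive"
```

The point of the weighted fusion is the one made in the conversation: no single check (frame rate alone, background alone) is decisive, but an attacker has to beat all of them at once.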

Adam Roth14:29

So we're going to have to hash your face.

Mike Pedrick14:32

Yeah. And worse, actually, unfortunately, because even hashing my face isn't going to do it. That's where it gets interesting. Plus the hash is still interceptable at an AI level, because what's happening is I'm building an entire model of me off of all the stuff that's out there in the public. But there's still some challenges with that, which is why we're going to see this race very, very soon. And that's part of the reason I'm actually looking forward to starting the new role on Monday, which is going to be amazing.

Joe Patti15:03

Oh, you're fast.

Mike Pedrick15:04

Oh, I'm geeking out. I mean, they basically made the role for me.

Adam Roth15:09

Hey, Chris, you ever see the movie Red Notice?

Mike Pedrick15:11

Yeah. Yes.

Adam Roth15:13

Yes. Yes. Yes.

Mike Pedrick15:14

Yes. Exactly.

Adam Roth15:18

Ryan Reynolds, man. That's the man. He did that sneak with that. Yeah, that was amazing. And that's what they're talking about now. Actually, I think it was even on LinkedIn. There was a Chinese guy, a Chinese grandfather, and a young Chinese person, because it was all done in China. And they created a dead person with a live kid and put them together in what appeared to be real animation, and meanwhile the grandfather's been dead for many years. So how do you know?

Mike Pedrick15:50

So we're seeing that a lot. You're seeing it in movies, you're seeing it in other things. You're actually seeing it in animation, and some of those folks are currently on strike because they're seeing AI take over some of their roles and some of the voice stuff. I did a talk fairly recently: you need basically, I want to say, like 10 or 15 seconds' worth of somebody's voice and somebody's video to be able to replicate them effectively enough to pass most systems. If you want to do it properly, you just throw more processing at it. It's as simple as that.

Francie Dudrey16:23

Fun little fact: this platform we're on, they now have the capability, like if you're editing a video and you're like, oh, I didn't really want him to say that, you can change it and it looks like he said exactly what you typed in.

Joe Patti16:41

Oh, really? We're going to have some fun with that.

Francie Dudrey16:46

Thanks, Francie. But yeah, it's a beta. So I don't know if you guys have it yet, but I have the beta. But it's nice, because it's like when people, you know, you're like, oh, man, I wish they would have said something like that. You can just do it now. But there are ethical concerns. You have to do it.

Joe Patti17:06

That's completely unethical. I mean, I'm even nervous doing a show editing people too much, but changing what they're saying?

Adam Roth17:13

Joe, stop, stop, stop. Joe, what we're going to do is we're going to make Chris say, if you don't sign up for our podcast, he's going to tase you.

Mike Pedrick17:22

If you ask him, he'll just say it. I mean, happy to make that up.

Adam Roth17:27

Hey, Chris, can you say that?

Mike Pedrick17:28

Sign up, sign up, or I'm going to find you, kick your door in, and tase you. Thank you.

Joe Patti17:35

Awesome. So you don't need to deep fake Chris for that. There you go.

Chris Roberts17:40

I'm going to make that a ringtone though.

Joe Patti17:43

Well, I've been wanting to do a show just for a gag when I get around to it. It's kind of a fun thing. I thought we would do a show where it's just Adam and me and we talk and we like switch each other's voices just to be a wise guy.

Mike Pedrick17:56

Oh, there's a really good tool. Remind me afterwards. I'll send it to you. I've been using it. I'm actually building a presentation at the moment. And again, for the new place. I'm building a presentation at the moment that starts off as me. And it morphs, and it morphs into several different figures and several different people, each with their own tone.

Francie Dudrey18:17

Can you be me?

Mike Pedrick18:19

Here's the reason I'm doing it. If you think about it, from an attack perspective, I can be authoritarian, I can be all these things, but there are certain things I cannot be as me as an adversary. But if I mess around with me, and I turn myself into pulling on some of your other emotive strings, and as an attacker I go after, maybe I go after sympathy, or I go after empathy, or I go after these other different emotive strings, I can use this, and I'm using this single system to actually build versions of me, and then from that stuff I do an attack profile on you, I see what will actually work with you, and I can just basically multitask with that thing. It's scary, and it isn't that hard.

Chris Roberts19:00

Have you guys, have you heard, the story made the rounds on Saturday, but Ferrari, the, an executive at Ferrari. Oh yeah, yeah, yeah. CEO.

Francie Dudrey19:11

No, I didn't hear it. What happened?

Chris Roberts19:14

This was a deep-faked voice message through WhatsApp. The CEO, quote-unquote CEO, we know it wasn't the CEO, sent a message to, like, the operations guy, I think it was, something like that: here's an NDA that's coming, be ready to sign it, something to that effect. And whatever executive received the message was a little bit suspicious and said, you know, okay, can you remind me, what was the book you recommended I read recently? Right. And the other end didn't know. No clue. Didn't have any idea about this thing. But the voice was, you know, brutally on the money. Like, this was the CEO's voice, clearly, right? So it's not like the text message that says, I need you to pop down to the store and buy, you know, five iTunes gift cards. This was actually the CEO's voice. But a simple question, what was the book you recently recommended I read, short-circuited that whole process. I can't even imagine how much money they would have gotten out of a company like Ferrari if that attack had been successful.

Adam Roth20:22

So, yeah. So, Mike, my wife gets mad at me. I do those challenge questions all the time. I'm like, I want to make sure it's you. What was the last nasty thing you said to me? She won't be able to keep track.

Joe Patti20:36

She cycles through a bunch.

Francie Dudrey20:41

That's just something that we talk about. It's interesting. We just did a webinar, our Threat Report webinar, and one of the most requested things that people want to learn more about is cybersecurity awareness. And I'm like, really? But it's true, because this stuff changes all the time. It's like, you know, if you get a text from your CEO asking you to buy, like, five gift cards, yeah, that's BS. But if somebody actually calls you and sounds like the person? Like, that's crazy. And I don't think a lot of people realize that that can happen.

Joe Patti21:14

I think they're starting to see it, though. And this is, I mean, this is like Mission Impossible shit. It's crazy, the real-time stuff they can do with it.

Mike Pedrick21:23

Here's the challenge. Here's the challenge on that one. I mean, Joe, to your point, yes, some people, I think, are becoming more aware of it. Partly because it's in the news, partly because we're seeing it in movies, et cetera, et cetera. I mean, Bruce Willis was basically re-imaged to use in future movies, given his current mental situation or health situation, whichever way you want to look at it. But the challenge is, so much of a percentage of the population, the vulnerable, the older ones, the younger ones, and a bunch of the population, we still haven't gotten them to challenge a freaking password request or a, hey, UPS is sending you a thing, please send money. They're still getting taken in by the flipping scams where the prince is sitting on 55 million and would you like some of it? This takes it to a whole other level, and that's the problem. We're not ready for it.

Adam Roth22:15

There's two things I want to bring up about that. One, here's the irony, right? We're all in this field, some more than others, some with specialties in cybersecurity, but any single one of us, as knowledgeable as we are, is still vulnerable. It's kind of like the couple of times I've heard about somebody that was a parachutist, and they were so comfortable parachuting, they jumped out of the plane forgetting to put their parachute on. And that's true. There are people that are so comfortable they have actually, unfortunately, met their demise because they didn't check that they put their parachute on. And we're capable of that. Sometimes we put ourselves in such a position that the confidence is there, but it shouldn't be. We should always list ourselves as vulnerable. And another thing I wanted to talk about, AI, before I forget: AI comes in different forms for us. AI is sometimes embedded into a website to help us make decisions. AI is sometimes embedded into a product in order to make that product better. And then the other thing that we were talking about before, about issues with AI: one of the biggest things that's come out recently, when I say recent, in the last year or two, is that AI can be considered biased. And that's another whole thing in itself. Its responses can be considered biased towards certain religions, certain races, certain political affiliations.

Joe Patti23:51

Well, that's true. And I'll tell you, with some of the biased stuff, I mean, it's bad enough that there's the inherent stuff because of the technology. Like I say, I hate the word hallucination, because it's just a Silicon Valley euphemism for "it's wrong." We're also seeing stuff where...

Francie Dudrey24:09

I love it.

Chris Roberts24:10

Thank you for saying that out loud. More people need to hear that.

Joe Patti24:13

Absolutely. That's it. That's just wrong. Well, I wouldn't pay money. I don't pay money for it, but it's just wrong. But also, as if that wasn't bad enough, we have some pretty good indications that companies are also deliberately baking even more bias into it. And that worries me too. I mean, like, you know, like some of the Google stuff is like, I just can't believe that that's an accident. Some of it's so outrageous.

Francie Dudrey24:43

When you type in a search and you're like, yeah, really? That's what I was searching for?

Chris Roberts24:48

I want to say something potentially incendiary, and I want to be judicious about how I say it so I don't take inbound, right? Bias and beauty have something in common: they're in the eye of the beholder, right? It is not AI's fault that there may be bias, right? Because you assemble five people reviewing the same data, and a portion of those five are going to see bias and a portion are not, right? Whether the person who doesn't see bias is ignorant of the larger picture, or the inverse, the people that do see bias are ignorant of the larger picture, whatever the case may be, right? Adam, you look like you want to jump in there. Go for it.

Adam Roth25:33

Wait, but keep in mind, you're 100% right, but wait, there's more. It depends on who your sample set is.

Chris Roberts25:41

Right, right. And so I feel like the knee-jerk reaction, that this code base or this thing is broken because we see bias in it, that knee-jerk reaction is more dangerous than the bias itself. Because we're breaking the thing, like, oh no, it output something that we consider to be taboo, I've got to go back in and rework the whole thing. Well, maybe, but let's get to the source. Let's evaluate: why did it come out with that thing? Now, please understand, at no point in time will I ever advocate for... what was the story in 2016, where they released an AI chatbot that became, you know... Oh yeah, right. Here's a really super good idea, right? Don't ever give a chatbot to, like, 4chan or Reddit. Just don't.

Joe Patti26:40

Not even for entertainment purposes.

Francie Dudrey26:42

It's not even good.

Chris Roberts26:44

If you wouldn't give them an infant, don't give them a chatbot.

Adam Roth26:47

I don't want you guys to judge me, but I've been using AI for about 41 years. No, I've been using Eliza. Should I stop using Eliza? Do you remember Eliza?

Francie Dudrey27:01

Do I know what Eliza is?

Joe Patti27:03

I thought I was your therapist, not my AI.

Adam Roth27:06

Do any of you remember Eliza?

Joe Patti27:09

Yeah.

Chris Roberts27:10

Don't help me out with this.

Joe Patti27:11

Oh, you don't?

Chris Roberts27:12

Did I miss something big?

Adam Roth27:14

Hold on.

Joe Patti27:14

Chris, you know Eliza. That's an old school thing, right?

Adam Roth27:17

That chatbot has been around for over 40 years.

Joe Patti27:21

A long, long time.

Adam Roth27:24

AI is not new! AI is not new!

Joe Patti27:35

Eliza was an early chatbot that was, kind of, like a psychologist. When you told it something, it would ask you things like, tell me about your feelings, or tell me more. It wasn't very sophisticated, but it seemed real to people.
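The pattern-matching trick Joe describes is small enough to sketch in a few lines. This is a minimal Eliza-flavored responder, not Weizenbaum's 1966 program, which also swapped pronouns ("my" to "your") and used a much larger script of rules:

```python
import re

# Minimal Eliza-style responder: match a keyword pattern in the user's
# utterance and reflect their own words back as a question.
RULES = [
    (re.compile(r"\bI feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bI am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (\w+)", re.I), "Tell me more about your {0}."),
]

def eliza(utterance: str) -> str:
    for pattern, template in RULES:
        m = pattern.search(utterance)
        if m:
            return template.format(*m.groups())
    return "Tell me more."  # default prompt when nothing matches

print(eliza("I feel anxious about AI"))  # Why do you feel anxious about AI?
```

The illusion of understanding comes entirely from the reflection: the program has no model of meaning, which is exactly why it "seemed real to people" while being trivially simple.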

Mike Pedrick27:46

We used that engine. Side story on this one. Many, many years ago, an individual who did some good books that we're well aware of, including The Restaurant at the End of the Universe and Don't Panic... I got dragged in to help out on Starship Titanic. And we actually used a version of Eliza, we actually forked off a version of that, to use some of the feedback loop for the conversational side of it. So, remember the early version of Hitchhiker's, the early game? Oh God, the Hitchhiker's Guide to the Galaxy thing?

Adam Roth28:20

Or no, is that what it was? Yeah.

Mike Pedrick28:22

Yeah, that was that. So this was actually not Hitchhiker's. We actually did another one with Douglas, which was called Starship Titanic. If you have a look at it, it's another one. So I got some of the code behind the scenes on that, which was actually loosely based off of some of the Eliza stuff, because we were looking for feedback loops and comments and all sorts of interesting stuff. Yeah, that's the same, similar thing. That's how far back I go with this kind of crazy shit.

Adam Roth28:45

That same year I was using MUDs, and I was using IRC or mIRC.

Mike Pedrick28:55

I still use IRC.

Adam Roth28:57

What did you say?

Mike Pedrick28:58

I still use IRC channels.

Adam Roth29:01

Oh man, I actually used it a couple of years ago, but I was like, man, why is the port 6666? No comment. You know, if you do the OSCP, which I was doing, the OSCP still uses IRC too.

Mike Pedrick29:16

So here's the interesting thing on this one, when we talk about data, which, let's face it, our world is coming more and more to. One of the things, again, that's going to be part of this other role, and something I've done before with a company that's already out there, is adversarial AI. The logic is simple: you get it to eat itself. So, perfect example on this one: a couple of years ago, and actually fairly recently, I again got dragged in to deal with some folks out on the East Coast, who were like, hey, this vendor's coming in, and they've got this new toy, and it's got AI, and it'll protect us from everything. And we're like, playtime. And so we ended up training their AI our way, feeding it all sorts of interesting data, bypassing its checks-and-balances algorithm slowly but surely, to the point where it basically removed its own systems from the network because they were actually a danger to itself.
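The "get it to eat itself" idea here is a classic slow-poisoning attack on a system that keeps retraining on what it observes. A toy illustration of the mechanism, with an entirely invented detector and numbers, no real product works exactly like this:

```python
# Toy slow-poisoning demo: an anomaly detector that updates its baseline
# from anything it does NOT flag. An attacker feeds values just inside
# the tolerance band, walking the baseline until a once-anomalous value
# passes as normal. All classes and numbers are invented for illustration.

class NaiveBaselineDetector:
    def __init__(self, baseline: float, tolerance: float):
        self.baseline = baseline      # learned "normal" value
        self.tolerance = tolerance    # how far from baseline is still OK

    def is_anomalous(self, value: float) -> bool:
        return abs(value - self.baseline) > self.tolerance

    def observe(self, value: float) -> None:
        # The flaw: it learns from anything it didn't flag,
        # so the baseline can be dragged by patient input.
        if not self.is_anomalous(value):
            self.baseline = 0.9 * self.baseline + 0.1 * value

detector = NaiveBaselineDetector(baseline=10.0, tolerance=3.0)
attack_value = 25.0
assert detector.is_anomalous(attack_value)       # flagged at first

# Poison: repeatedly feed values just inside tolerance, nudging upward.
while detector.is_anomalous(attack_value):
    detector.observe(detector.baseline + detector.tolerance * 0.9)

assert not detector.is_anomalous(attack_value)   # now passes as normal
```

Each poisoned observation moves the baseline only a little, which is why the story above says "slowly but surely": the attack works precisely because no single input looks suspicious.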

Joe Patti30:16

Well, here's the interesting thing that comes up with that. The same thing as bias being in the eye of the beholder. How do you define what's dangerous to itself? That's a devious mind.

Mike Pedrick30:31

Who's watching the watchers? If you start taking a look at identity, think about it. Years and years and years ago, identity was user ID and password: 3-4 characters, if you're lucky, have a nice day, and off we go. Then we started putting on more and more layers. I'm actually going to do a LinkedIn post on this one at some point in time. Because then it was like, all right, now we need multi-factor, now we need this, now we have PAM, now we have IAM, now we have user identity, and we have all this other stuff. What we're now going to have to build in is literally a watcher of the watchers that's watching to see the consistency across systems, and then doing some variations on that to see, am I being poisoned over time, you know, all these other things. That layer of authentication to see who is actually coming into that system, that layer of the cake, is just going to explode over the next couple of years.

Chris Roberts31:23

We're already seeing some of that, right? So, you know, the IAAA process, right? Identification, authentication, authorization, accounting, right? Yeah. And for the record, just so Joe, Adam, you guys know, I spent a lot of my time teaching, so if something comes out of my mouth and it sounds like it came from a textbook, high likelihood it did. Anyway, it used to be something you are, something you have, something you know, right? Like, that's so 1993. We're past this, right? So now we're looking at behavioral analytics. Is the way that you're authenticating consistent with what we're expecting? If it's not, suspicion, right? Maybe we let you in anyway, but we're going to elevate that scrutiny. We're going to be watching you more closely, right? So that behavioral analytics piece is like, we're already there. That's already happening. A lot of folks don't realize it. And a lot of organizations can't afford the tooling to permit that yet, right? But the reality is there. We're already there. You're signing in from a place that you haven't before, you're signing in from a device you haven't before: this is academic, this is table stakes. But now it's, what is the behavior that we see once you're in, or as you're signing in? Is it consistent with what we expect?

Adam Roth32:48

So one of the episodes that we did over a year ago was Passwords Must Die. And I think we're kind of segueing.

Joe Patti32:54

Episode one, the first thing I wanted to say, Passwords Must Die.

Adam Roth32:58

So I think we're segueing into what do we think about whether it's AI or not, What's going to be the future of, can we do this show? Can we segue into this? What's going to be the future of being able to be authenticated? Will passwords go away? Will it be AI? Will it be some kind of identity?

Joe Patti33:20

You know, I think that's a great question because it kind of gets to, how do we authenticate people when you have this situation where, you know, to use an old thing, it can literally trick your mother, you know?

Francie Dudrey33:33

Right.

UNKNOWN33:34

Right.

Joe Patti33:36

Okay, so then that is the security step identified.

Mike Pedrick33:40

I mean, if you think about it, I mean, Adam, to your point, I mean, you can't make it more complicated. That's just not going to work, because people will find ways around it. I mean, we're humans. When we said use your phone to authenticate with, that became a challenge, and people found ways around it because they don't have their phone to hand, et cetera, et cetera, et cetera. The tokens came and went to some degree, the soft tokens, because you kept losing the stupid things. Think about it, when Mac was the one that mainstreamed the fingerprint stuff, and then they suddenly realized that was going to go away, then all of a sudden the biometric facial stuff, we realized we could bypass that with pictures.

Adam Roth34:20

And we have problems with COVID, Chris, forget about COVID.

Mike Pedrick34:24

Yeah, good point. And I think we have to get to the point where it has to be seamless. And again, this is the nice thing about going off to play with these new folks. It's got to be as seamless as humanly possible. But it's going to have to be, to Mike's point, relatively convoluted behind the scenes to go, is it the right person in the right place at the right time? You know, if Mike logs in today from here, tomorrow, let's say Mike logs in from India. Can that authentication system back out of where it is, do a quick check and balance with Amex, and see that Mike actually is now somewhere else? Or can it go into Mike's social profile and go, oh yeah, Mike's been posting pictures from India? Can it do enough outside? Basically, it's building its own threat profile on Mike in a near real-time basis to go 90, 95% probability, and then we're gonna have to turn around and go, well, are we accepting on 90, 95, 92, 80?
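That "are we accepting on 90, 95, 92, 80" question is just a fusion-and-threshold problem. A minimal sketch, assuming independent signals with invented probabilities (a real system would derive these from actual telemetry, not hardcode them):

```python
# Sketch: fuse several independent identity signals into one confidence
# score and compare it to a configurable acceptance threshold.
# Signal names and probabilities below are hypothetical.

def fuse_confidence(signals):
    """Combine per-signal probabilities that this is the right person.

    Treating signals as independent, the overall impostor probability is
    the product of each signal's impostor probability (1 - p).
    """
    p_impostor = 1.0
    for p in signals.values():
        p_impostor *= (1.0 - p)
    return 1.0 - p_impostor

def accept(signals, threshold=0.90):
    """Accept the sign-in only if fused confidence clears the threshold."""
    return fuse_confidence(signals) >= threshold

signals = {
    "device_known": 0.70,      # hypothetical: device seen before
    "travel_plausible": 0.60,  # hypothetical: card activity matches location
    "behavior_match": 0.50,    # hypothetical: usage pattern looks normal
}
```

With these made-up numbers the fused confidence lands around 0.94, so the same sign-in passes a 90% bar and fails a 95% one, which is exactly the policy knob being debated above.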

Adam Roth35:22

That revolves around all these different technologies. Is SIEM going away? Because SIEM is a lot of what profiles that stuff too. What do you do with the data? Yeah.

Chris Roberts35:33

So the one thing, I wanted to jump in here, and Francie sees this coming already. So, privacy. I don't want an authentication system knowing that I'm vacationing in India. It's not their business. That's just not a part of the picture.

Joe Patti35:53

But it's going to have to be.

Chris Roberts35:55

Well, if I have to travel with a friggin Motorola Razr, I will do so. Right. That's just part of it. But I say that, and then I hasten to add this. I'll give you guys an anecdote, and I'm going to throw Robinhood under the bus with this, right? So I signed up for Robinhood, and I got a free share of Chesapeake stock. And this was just before Aubrey McClendon, the former CEO, ran his Tahoe into a bridge abutment at 40 miles an hour, accidentally, to avoid indictment. Perfect. Yeah. That event led to my stock in Chesapeake sort of erasing. Like, Chesapeake doesn't exist, and so that free stock doesn't exist. Therefore, I have no money with Robinhood. Okay, when you sign up for Robinhood and you give them a phone number, they implemented multi-factor authentication such that if you sign in from a new device, it sends a code to your phone number. Hey, that's great. I'm a fan of multi-factor authentication, except I don't give out my cell phone number. I give out a landline number that can't receive text messages. So I reached out to Robinhood and I said, hi, I am me. I assure you I am me, but I can't get into my account, and I no longer want to receive marketing material from you. I want to delete my account. Kid you not, they said, scan your driver's license and send it to us. Let me make you a list of the things that I'm not going to do. The first five on that list are exactly what you just said. So, where I'm going with this is that in order to get that authentication, for them to validate that I am who I am, they're asking for things that violate privacy.

Joe Patti37:44

I go even further and say, besides violating your privacy and finding ways to get in the hands of bad people and all the things that happened with that, it's almost when we start basing identification and authentication on not tokens and things that are assigned to you, but things like you, your own attributes. Well, the ability to identify them is also the ability to duplicate, basically. Yes.

Mike Pedrick38:11

And that's why we're not going to get rid of passwords, because guess what? If you lose this stupid thing, just replace it. You lose your fingerprints to an adversary, you'd be screwed.

Chris Roberts38:20

We need people to be smarter with passwords, not to get rid of passwords.

Francie Dudrey38:23

And that's the thing is that, I mean, I can tell you, working on the marketing PR side, we still get requests for password, you know, password instructions on how to keep, you know, how to do the right password and passphrases. And so, like, it's very basic stuff, but they're asking because people still don't know.

Mike Pedrick38:44

Well, not only do people not know, but I mean, you've only got to look at the top 20 that comes out every single flipping year. I mean, good grief alive. The fact that we make passwords complex by adding seven, eight, and nine to the end is not saying too much. Again, let's just say there's 15, maybe 20% of the population that knows what it's doing, it's like, woohoo, great. That leaves an awful lot of vulnerable people who either, A, don't have something on their phone, or don't have a phone, or, B, their computer is just straight into it because it's easier that way. Or a thousand and one other things. You got Alzheimer's? You're not going to be remembering that flipping password every five minutes.
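That "top 20 every single flipping year" problem can be checked mechanically: a blocklist check that also strips the lazy digits-bolted-on trick. The tiny list here is a stand-in for the real annual common-password dumps:

```python
# Sketch: reject a candidate password if it appears on a common-password
# list, even after stripping trailing digits ("password789" -> "password").
# COMMON is a tiny illustrative stand-in for a real breached-password list.

COMMON = {"password", "123456", "qwerty", "letmein", "iloveyou"}

def is_weak(candidate):
    """True if the password is a common one, optionally with digits bolted on."""
    lowered = candidate.lower()
    if lowered in COMMON:
        return True
    stripped = lowered.rstrip("0123456789")
    return stripped in COMMON  # "Password789" is still just "password"
```

A long passphrase sails through; "Password789" does not, because adding seven, eight, and nine to the end buys essentially nothing against a list-aware attacker.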

Joe Patti39:21

I don't know.

Francie Dudrey39:22

I can't remember stuff.

Joe Patti39:24

Oh, Chris, you're right too. There are a lot of older people. Many of whom I'm related to, who cannot figure out a password manager and probably never will.

Mike Pedrick39:35

And that to me is where privacy, the illusion of privacy, comes in, because let's face it, privacy is an illusion. And again, to Mike's point, I'm going to throw a hot take out as well. Privacy is an illusion. Now, I would argue, by giving up more, you have the potential, if the data is looked after properly, you have the potential to gain some of that back. If I can prove who I am, where I am, and what I'm doing, then I can have a little bit of a cocoon of safety.

Adam Roth40:04

So, my dad, my dad, like, you know how we have LastPass, and 1Password, and all those different things? Yeah. My dad has his own password manager. It's a book he bought on Amazon that says "Passwords" on it. And he puts everything in the book.

Joe Patti40:20

I said, oh my god, what are you doing? What a coincidence. My dad too.

Mike Pedrick40:22

But hey, you know what? Nobody online is going to steal it. Nobody online.

Adam Roth40:26

I know that.

Joe Patti40:27

In a sense, it is very secure.

Adam Roth40:28

In a sense, yeah. It's air-gapped. It's air-gapped. But you know, so it's funny. When I used to work for Joe, I worked at a company, and we were dealing with EDR, which I don't even want to talk about. So I turned around and I said to the guy in the room, you don't have a tablet or anything? You have that little notebook? He goes, yeah. He goes, you'll never steal my information from my notebook as long as I have it. And I said, and what did you used to do for a living? He goes, I worked for the CIA. And I was laughing my ass off. It was like, oh my God. He goes, you know, go back to basic stuff. But I was going to say something else. I'm trying to remember what it was. But we're not going anywhere away from passwords or authentication. Oh, Mike, if you want to be safe, I mean, I'm Jewish, but there's a lot of religious Jews, they buy kosher phones. And the kosher phones, Google it. Google it. This isn't a pun, this is a thing.

Chris Roberts41:26

I can't trust you, Adam. The kosher phones, and they have

Adam Roth41:34

Limited amount of functionality that prevents people from doing or looking at certain things. It's cool. They don't work on Saturday, right? So I'm gonna Google it, kosher phones. And they think of everything: kosher cell, browse free cell phones and devices, koshercell.org. Here we go: most phones have open internet access that can be addictive, time-wasting, and dangerous. At Kosher Cell, we provide you and your family with safe, distraction-free phones so you can access useful smart apps without any of the bad stuff. This is not a sponsored video.

Francie Dudrey42:15

Is this like a parental locks kind of situation?

Adam Roth42:18

No, it's just for certain people, certain sects, certain religions.

Mike Pedrick42:22

Same thing as the Muslims.

Joe Patti42:25

There's a koshermobile.com, I can't believe that. But I mean, just get a flip phone.

Francie Dudrey42:30

I bet that's the thing. I'm like, why don't you just get like a, you know, where it's like T9 texting and stuff.

Joe Patti42:35

Well, I, yeah, that's good. Is that a, is that a dumb phone approved?

Chris Roberts42:41

They also make phones.

Adam Roth42:50

They also, they have these other phones that are not Google and they're not Apple, but they're kind of smartphones. It's kind of weird.

Mike Pedrick42:57

I forgot the name of those phones too, but there are phones out there that will not have these apps, but they're built to do certain other enhanced stuff. So I don't know what they all do, but Armadillo is another company that does really flipping nice ones. I mean, those beasties are the ones you can take into all sorts of interesting places, China or Russia, and not care if they get lost or stolen. Yeah, no, Armadillo's good. Those ones are, they're nice ones.

Chris Roberts43:23

Here I was thinking that the last generation of dumb phones came from Microsoft.

Adam Roth43:29

Wait, wait, wait. So I should be, I have an iPAQ. iPAQs, remember those? Yes.

UNKNOWN43:37

Oh my gosh.

Adam Roth43:38

Wow. Maybe Joe, you can, yeah, we'll reference Armadillo phones. Maybe we can get a free one. Yeah.

Francie Dudrey43:44

Do you think the younger security pros, like the guys getting into the business, watch this and are like, what the hell are these guys talking about? And are they 75 years old?

Adam Roth43:56

Oh, I'm 80. I'm 80, by the way.

Francie Dudrey43:58

OK. I would be curious.

Joe Patti44:00

Believe it or not, our viewers skew a lot younger than I thought. So they have so many interests.

Francie Dudrey44:06

Oh, my god.

Joe Patti44:07

Oh, yes. There we go.

Francie Dudrey44:09

I had that.

Joe Patti44:10

Wow.

Adam Roth44:11

By the way, that's his tablet. By the way, I was born January 1st, 1999. Oh, 1900 for me. Yeah, I got 1900. Anytime I can do 1900, I always do. I think on LinkedIn I'm like January 1900 or something. Do you know what's funny? We had an attorney on, and she told us, I think she even wrote books, maybe, about privacy, and she goes, yeah, but I still have Google. I think that's what she said, right?

Joe Patti44:59

Yeah, she said, it's really hard to live without this stuff. And like, you know, a person who wrote this book on surveillance stuff and everything, she told us, she's like, you guys got to calm down. Take it easy.

Chris Roberts45:14

Everybody has their own risk calculus, right? There's certain risks we're just going to take. I mean, listen, I am steadfastly committed to privacy advocacy at the local level, not at the y'all must do. This is what I'm going to do. I drive a Tesla, for Christ's sakes, right? Like, I am rolling surveillance. So it's, you know, but my risk mitigation is that the cars and their charger live on their own network segregated from everything else in my life. So you do what you gotta do to, you know.

Mike Pedrick45:52

Didn't they say that about industrial control systems a while back, that they were on their own separate network until, you know, somebody decided to plug them into the corporate net?

Adam Roth46:03

So, you know, we all put our camera covers on our computers. I would say all of us on this call, if I'm wrong, don't have camera covers on our phones. We don't turn off the audio. We don't prevent that. By the way, I haven't gotten a call in 10 years because I put my phone into a Faraday cage. So, I don't know.

Chris Roberts46:26

We don't use our phones for calls anymore anyhow, right? No.

Joe Patti46:31

Seriously who talks to anyone?

Francie Dudrey46:34

My life is so boring, so listen away, really. You're gonna hear me yelling at my daughters to put their pants on. And there goes Mike.

Joe Patti46:48

Okay, so here's a question. This is really going to stoke it when we're talking about AI. We're talking about the hot stuff, the cool stuff, everything. Are we actually worrying about AI too much at the expense of all the other things that people are getting killed on? Look at the big breaches we have and the stuff we've had lately.

Adam Roth47:08

It's not being caused by AI. With great power comes great responsibility.

Francie Dudrey47:15

Oh, was that your quote?

Adam Roth47:17

No, that was deep. That was what's-his-name's from Spider-Man. Yeah, I was gonna say, do you have a nephew that got bitten by a spider? Some poor kid tried to get himself bitten by spiders so he could turn into Spider-Man recently. So.

Francie Dudrey47:35

See, we need to put more money in education in the United States. I'm just saying. Good Lord.

Joe Patti47:41

Well, that's it. It's like, for all this whiz-bang stuff, people are just getting dumber and dumber. I don't know, it's crazy.

Francie Dudrey47:48

No, I think there's a way, like with AI, yes, there's a ton of buzz around it, but me personally, I leverage it a lot because, you know, small team ideas and things like that. But what Mike said before is like, you got to check your shit. Like, you can't just rely on this stuff to float you. It's not going to happen. You need to learn how to use it effectively alongside what you do. There has to be a human element in everything. So I think there's a lot of hype because, you know, it's a good headline. I don't think that I don't think we need to be freaking out and hyping it up as much as it is right now, because I think it's just a matter of harnessing it in a way that's responsible and using it effectively.

Mike Pedrick48:39

Here's my thought on this one. Yes, I think we need to worry, but not for the reasons that we are worrying. I would look at AI and the compute power it uses as probably one of the worst eco-terrorists out there.

Francie Dudrey48:55

That.

Mike Pedrick48:57

That. It's insane. Maybe 2, 3, 5% of AI usage would arguably be for the furtherment of the human endeavor and population. The rest of it, I'd argue, is crap.

Joe Patti49:13

And yet we are generating AI porn that really contributes to the system.

Francie Dudrey49:19

You know, whatever gets your jollies.

Adam Roth49:22

I think what I was going to say, we're using AI supposedly through different methods and means to communicate or try to communicate to alien races outside of our solar system.

Mike Pedrick49:35

We're never going to have a planet left to communicate. That's the thing. Let's put this on here.

Adam Roth49:41

Well, you don't want to talk about it. Can we communicate with our own people first? I'm trying to, I'm trying to bring aliens into this. If they come to visit me today. No, but.

Joe Patti49:49

The aliens are right outside, Adam.

Chris Roberts49:52

The aliens are going to land and think that we communicate in porn. That's the fast forward.

Francie Dudrey49:57

They're going to land and they're going to get the F out of here because they're like, these guys are no.

Adam Roth50:02

If you have anyone to blame, blame Chris and his pickles. I don't want to hear it.

Francie Dudrey50:07

They're going to see that and be like, no.

Joe Patti50:10

Well, I did read something interesting. I cannot remember who wrote it, but he said something like, if you look at all the power AI is using, they say like, I don't know, GPT-4 or 4.0 or something like that, like it's like as smart as a high school or college student, whatever it is. But it uses like 10,000 times the power, the energy of that human brain. And when you think about it like that, you're like, that's really not very good. You know, that's not so impressive when you put it like that.

Adam Roth50:46

I'm only using 1% of my brain, so we're good. No, but you know, look, look, look. I'm not going there. Give me a minute to say this. Look at AI like you would look at a weapon. AI should have a safety. You should be trained on how to safely use it. A weapon could be used for good. It could be used for bad. It could be used to hurt someone. It could be used to hunt, to bring food to your family. So if you treat AI like a weapon, and you use it responsibly, you should be able to use AI in the correct way. But the question is, where are those boundaries? Where does it end? Where does it begin?

Francie Dudrey51:30

Well, and the education, like learning how to do it. I mean, we've been talking about just, you know, how do we teach people how to use it? Like the prompts and things like that, ways to effectively get what you want to get out of it. And know that you can't just rely on solely AI to do your job for you. The whole idea around like AI is going to steal our jobs only if you don't know how the hell to use AI effectively. Like, you gotta have the human element in there to do it. And yeah, so I think that's right, what you were saying, Adam.

Joe Patti52:05

Yeah, for me, seeing what AI produces, if AI is going to take your job, I'm probably not doing anything too spectacular that way at the current level of it, you know?

Chris Roberts52:17

So all the paranoia around AI is that it's going to eliminate all these white-collar jobs, right? Like, if you said to somebody 30 years ago, 40 years ago, that there would be legitimate job titles like YouTuber or eyebrow designer, right? They'd have thought you had three heads. And yet those are job titles today, right? One of them is really lucrative. And I don't mean the YouTuber.

Adam Roth52:41

Eyebrow designer. Oh, is it opening?

Francie Dudrey52:45

Although I will say my kids were like, I want to be a YouTuber when I grow up. And I about, you know, disowned them. It's fine. If they want to, you know.

Adam Roth52:53

And yet here we are.

Francie Dudrey52:55

And here we are. I know. But I'm not unboxing a toy. I was. Francie, if you... Let's not... Actually, that... Yeah, let's not go down that road.

Joe Patti53:07

If you let them be YouTubers, they're never moving out. You realize that?

Francie Dudrey53:12

No, I know.

Chris Roberts53:12

They're moving into a bigger house.

Francie Dudrey53:15

They're moving all together. A creator house.

Adam Roth53:17

Look at those guys. Jake Paul and those guys. I mean, how much? Tens of millions of dollars they're making.

Francie Dudrey53:22

I don't know how that guy makes money.

Mike Pedrick53:25

Let me go back to the AI stuff for a moment.

Francie Dudrey53:29

Is it Chris getting us back on task?

Mike Pedrick53:33

No, not really, because there is an ulterior motive to this, and you'll figure it out in a second. So let me ask a serious question. We know that humanity needs to change in order to be able to effectively look out for itself. Yeah? The question for all of you is, how, in the past, has humanity changed course? What things have occurred to cause humanity to change course in the past? In other words, there's only one answer, I'm afraid.

Adam Roth54:05

I know what the answer is. It's got to be a catastrophic issue. Yeah, catastrophic. Well, it's not just killing people. It's whether we lose sunlight for 30 years and plants die. It could be almost close to an extinction event. We haven't had one recently, but it could be something where humanity might cease to exist: a nuclear war, a world war, stuff like that.

Mike Pedrick54:41

But even that really didn't do it, let's face it, because it was always a threat. It didn't actually, you know, we dropped a minimal amount of, unfortunately, we still dropped bombs on people, but we didn't drop enough, to be honest, to be able to deter people. We dropped enough to make people go, well, we need it too. I'd argue the same thing with AI at this point in time. Everybody wants it. I mean, let's face it, when ChatGPT was released to humanity, five and a half billion people on the planet went, woohoo. There were some people who went, woohoo, great, I can play with this. There were some people who were like, whoop, I don't want to do anything with it. And there were other people who were like, ha, ha, ha, ha, ha, ha, and used it to attack others. So fight, flight, and freeze. And so I hate to say it, until we actually demonstrate, same thing with security. People don't care about security until something happens.

Adam Roth55:34

Right. And that's actually very similar to what you're saying.

Mike Pedrick55:38

I know I can see you. Go for it. I'm listening. Go ahead, Mike. I see it.

Adam Roth55:42

It's ready to go. It's ready to burst.

Chris Roberts55:49

I'm usually disinclined to disagree with Chris. And I don't know that I'm disagreeing with Chris, right? But I am going to say this. I agree wholeheartedly that suddenly we had a race for everybody to have nuclear weapons. But the MAD doctrine was in itself a deterrent. We all have them. We are all going to die. Therefore, none of us are going to launch a nuclear weapon, right? But I want to step back from that because I want to expand a little bit on places in our known history, our documented history, which is really not that long, right? I'm not talking tablets and hieroglyphs on walls because I just don't think there's high reliability there.

Mike Pedrick56:32

No, we're talking 9, 7,000 years and that's it really, a couple of thousand years.

Chris Roberts56:35

Right, exactly. I think that the direction of humanity is either changed by calamity or catastrophe or some major global event, pandemic excluded, or a driving holistic interest in a thing, right? The industrial revolution springs to mind, right? That was game changing. It wasn't a calamity. Asterisk, right? But the drive for the increase in productivity was there, right? But here's the other thing that I would say about this, right? Is that nothing motivates like calamity. Think of all the things that we invented in the 20th century that we couldn't have even conceived of for 500 years prior, right? On a long enough curve, or on a specific curve, we had stagnated. We hadn't invented, we hadn't innovated to the extent that the entire globe did, or much of the globe did, during the World Wars, or in the years between World War I and World War II. So bringing it back around, I'm thinking to myself, AI, currently, not to take a dismissive view of it, is really just, it's in its infancy. It's not Skynet. It's not something you'd read about in a Philip K. Dick novel or Isaac Asimov's Four Laws. It is not that yet. It is a really sophisticated, really power-consumptive series of if-then-else statements and machine learning, basic fundamental machine learning. And a data lake. And a data lake. Gotta be a data pond at the very least, right? A data puddle? A data estuary? Data puddle? Data swamp.

Mike Pedrick58:41

It's a swamp for crying out loud. Let's just be honest about it.

Chris Roberts58:43

That's right, yeah.

Mike Pedrick58:45

Alright, so Mike. Oh, sorry, go on. I want to add to it.

Chris Roberts58:49

I'm getting close to the end. Go for it.

Mike Pedrick58:51

No, I like this. So here's my challenge on that one, though. I just looked up a stat. Back around the 19th century, there were about a billion people. Here's the challenge: that was a billion people who were connected over days' or weeks' worth of news that eventually filtered out. Now we have eight billion people, with five and a half, six billion connected people. Each one of them has got their own independent view, and they get news immediately. And so I think that's the interesting one as well. When news filtered out slower, it was disseminated in a more civilized way, and it was limited. You had a limited scope, a limited number of people, and a limited focus on it. We were able to actually come together and probably have more collaborative views. But now you've got five and a half billion connected people talking to, in total, eight billion people on this planet with six billion different views. And everybody wants their own piece of the pie. Everybody wants to be right. That's where I think we have a problem as well.

Chris Roberts59:51

Sure, but it's an individual feed to a woefully small number of news sources in real time.

Mike Pedrick01:00:01

Yes, that's true.

Chris Roberts01:00:03

The faces change, the names change, but the message is largely the same.

Mike Pedrick01:00:10

Yeah, that's fair.

Adam Roth01:00:11

Just to divert for one second. Meanwhile, we have all these connected devices and all this AI. I thought 20 years ago we were running out of IPv4 addresses.

Chris Roberts01:00:21

What happened? Oh, God. Yes. IPv4 addresses and helium were going to be no longer a thing right now, right?

Adam Roth01:00:31

I got to thank God for RFC 1918. Right.
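RFC 1918 is what let IPv4 limp along: the private ranges 10.0.0.0/8, 172.16.0.0/12, and 192.168.0.0/16 get reused behind NAT instead of burning public addresses. A quick illustrative check using Python's standard library:

```python
# Sketch: test whether an address falls inside one of the three
# RFC 1918 private IPv4 ranges, using the stdlib ipaddress module.

import ipaddress

RFC1918_NETS = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("172.16.0.0/12"),
    ipaddress.ip_network("192.168.0.0/16"),
]

def is_rfc1918(addr):
    """True if addr is an IPv4 address in an RFC 1918 private range."""
    ip = ipaddress.ip_address(addr)
    return ip.version == 4 and any(ip in net for net in RFC1918_NETS)
```

Note the explicit network list: `ipaddress`'s built-in `is_private` flag covers a broader set of special-use ranges than just RFC 1918, so spelling out the three networks keeps the check precise (172.16.0.0/12 ends at 172.31.255.255, so 172.32.0.1 is public).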

Joe Patti01:00:39

Okay, well, I think we're getting towards the end here. And you know what? I'm trying to end on a more hopeful note these days.

Francie Dudrey01:00:47

I know, it's like, whoo, this got me.

Mike Pedrick01:00:49

No, no, get everybody off that.

Francie Dudrey01:00:52

Talk about Chris's pickles.

Joe Patti01:00:54

Well, I'll tell you what.

Francie Dudrey01:01:00

Sorry, Joe.

Mike Pedrick01:01:01

These are really good listening devices, and that's all I'm going to say.

Joe Patti01:01:06

You know what, if there is some global calamity, and in 10,000 years some archaeologist, like, you know, picks a pickle, and is going to say, like, what was this? It must have been an object of existence or something. Oh my god. What's the meaning of this, you know? It's going to be a shrine to the pickle.

Francie Dudrey01:01:24

That'd be crazy.

Adam Roth01:01:25

By the way, if something happens to me, or I'm no longer seen, it probably wasn't aliens.

Chris Roberts01:01:33

Probably was not.

Joe Patti01:01:37

Anyway, so last night I was watching YouTube, as I often do, and I saw this little Star Trek fan film somebody was making. It was, like, the story of Khan, you know, from The Wrath of Khan. Khan! The guy, you know, Khan! Yeah! The evil warlord who brought, you know, war and almost destroyed the world? And then the Federation was founded. Well, they did this thing and they said, well, yeah, you know, according to Star Trek, when they made it in the 60s, Khan did all this bad stuff and left the Earth in 1996. And he went back to Fantasy Island. Yeah, exactly, with the other guy. But you know what? That's going on 30 years ago now. So maybe there is some hope. We've at least managed to escape the dread that they thought we were going to have in the 60s.

Mike Pedrick01:02:34

We got past the 16th, we got past Y2K, we got past all these other little bugs that we forgot about in computing systems. I think there's another couple coming up in the next few years. I keep seeing ones like Linux, Unix ones, and mainframe ones, but we're getting there.

Adam Roth01:02:51

Well, the next bugs that we're going to have are going to be more around protein and organic, and they're going to go from our brains to the computers and back. It's all good.

Mike Pedrick01:03:02

I'll take a different tack. I've spent the last couple of years, and I'm still messing around, but I haven't had a chance since I've been out in Missouri. I had built, and I've got to resurrect it, a version of me that is purely digital. I was monitoring brain signals for several years through a different set of glasses, these ones. And I was pulling out signal intelligence, pulling out data intelligence, and all this other kind of good stuff. And the logic was basically to build a digital version of me. I got it to the point where it knew I wanted tea or coffee before I wanted it, knew when I wanted mail. It could recognize me as I walked up to the computer, and a whole bunch of other things. Now the question becomes, one, when does that become sentient and on its own? And if it is, it's digital, which means I can send it by wire. I can send myself via signal. I don't have to be connected. So you want your aliens? Bugger waiting for them to come to us. I take a digital version of me and I send it across the airwaves outside of this planet.

Adam Roth01:04:02

Why don't you do that for Monday when you start, when you work your job?

Mike Pedrick01:04:07

I don't think it's going to confuse them, but let me solve the deepfake stuff first, and then I'll play with this. They have a huge lab. I am looking forward. I'm going to be taking over the lab. Let's just put it that way.

Adam Roth01:04:20

Oh my god. Can I be your intern for free?

Mike Pedrick01:04:24

Oh, I'm looking forward to this. And the pickles will be joining me.

Francie Dudrey01:04:27

The pickles should join you.

Adam Roth01:04:29

Awesome.

Mike Pedrick01:04:30

All right.

Adam Roth01:04:30

Oh, deepfake pickles. Deepfake pickles. Deep fried deepfake pickles. I think we've got to the end here, huh?

Joe Patti01:04:40

Yeah, well, you're going to be doing something very cool. I can't wait to see what happens with that. And there is much more to be written on this AI stuff. It's not going anywhere for now. We've got to figure all this out.

Mike Pedrick01:04:53

If we can figure out how to power the damn thing effectively, we'll get there.

Francie Dudrey01:04:58

Yeah.

Joe Patti01:04:59

Seriously. Okay. Well, everyone, thanks so much for joining. This has been a blast. Francie Dudry, Mike Pedrick, Chris Roberts, and my usual cohort, Adam.

Adam Roth01:05:12

Before we go, if anybody wants to be on the panel as a good idea, please reach out to us. We'd love to do this again. And if you want, we can maybe bring back some other guests. So let us know.

Francie Dudrey01:05:24

Panels are fun.

Joe Patti01:05:25

Yes, and thanks to all of our viewers for hanging out till the end.

Adam Roth01:05:31

We'll have more. You should have done like Deadpool, where people, we sit there. Don't spoil it, I'm going to see it this weekend. No, no, this happens in every episode. No, no, you sit there, we wait, we wait, we wait, we do the credits, we do the credits, we do the credits. Only after credits, see, okay. Oh, you're still here? Wait, clean up your aisle.

Francie Dudrey01:05:52

Put your food in the bag. But the pickles would say it.

Joe Patti01:05:54

There we go. Okay, for the after credit scene, I think maybe I will do an AI pickle animation if we can figure out how to do that.

Mike Pedrick01:06:05

There's your pickle.

Joe Patti01:06:08

Great.

Francie Dudrey01:06:08

I'll send you a pickle. Oh, this is fun.

Joe Patti01:06:10

Awesome. All right. Thanks, everyone. It's great seeing you guys. Thank you. Thanks for having me. Take care, everybody.