376: Real-Life Infinite Money Glitch?! (AKA: Check Fraud for Beginners)

Episode 376 September 05, 2024 01:14:24

Show Notes

Yubico security keys can be cloned, D-Link isn't fixing critical router flaws, and an Ohio city is suing a researcher for colluding with a ransomware gang...but not all is as it seems. This week on Technado, it's time to debunk the clickbait.


Episode Transcript

[00:00:04] Speaker A: You're listening to Technado. Welcome to another episode of Technado sponsored by ACI Learning, the folks behind ITPro. Reminder that you can use that code, Technado30, for a discount on your ITPro membership. This is another episode of Technado. And I today am a tornado of anger swirling about, and I am looking forward to getting into these articles because some of them are interesting. Some of them just make me angry. What say you? [00:00:30] Speaker B: What I say is it's fun to be back. Welcome, everyone. Thanks for joining us today. Yeah, she was a whirling dervish of anger this morning over today's articles, and I was like, cool, I'll just put a couple of articles in. We'll just let you take the lead on these because this is going to be kind of fun. She's wound up like an eight-day clock. [00:00:50] Speaker A: Every day I get more radicalized on certain issues. I just. I had a. [00:00:56] Speaker B: She's black pilled. [00:00:58] Speaker A: I kind of. I'm getting there for some of these things. I had an experience, I guess, last week, this week where somebody reached out to me and asked if I'd be willing to lend my likeness to an AI avatar. I won't name the company. [00:01:11] Speaker B: Yeah, you told me about that. [00:01:12] Speaker A: Yeah, a service, basically. They had several services that they were planning on offering. It's like a startup where you would be able to talk to, like, an AI avatar for, like, interview coaching and things like that. I'm like. [00:01:21] Speaker B: And other things. [00:01:22] Speaker A: Right, okay. [00:01:23] Speaker B: Which you were worried about. [00:01:24] Speaker A: Well, yes. Yeah. So I wasn't quite sure exactly. And then, of course, payment is weird. Like, with AI stuff, this is all so relatively new that it's hard to know. And then how do I control whether they use my likeness for certain things? [00:01:35] Speaker B: Right. There's not, like, an established road for these things. [00:01:39] Speaker A: It's kind of a wild, wild, wild west. [00:01:41] Speaker B: Yeah. [00:01:41] Speaker A: So I heard him out because I didn't want to just immediately be like, no, that's stupid. I was like, I don't think I'm gonna do this, but let me hear what you have to say. And they didn't want to use my voice, just my likeness. So I feel like that's even more. [00:01:52] Speaker B: Like, this is getting sketchier. [00:01:54] Speaker A: Where's this going? And one of the uses was like, oh, if companies don't feel like conducting first round interviews, they can just have AI do it. So you click the link to go to your Zoom interview, and it's me standing there as an avatar. Like, welcome. Please state your name clearly and tell me about your experience with Photoshop, and then the AI gives the information to the employer, doesn't include any information about you, your demographics, whatever. Just gives a summary of your answers and what it thought about you. I analyzed the pitch as being this, the tone as being this. I'm like, that, to me, is dangerous. I feel like there's so much that comes in a human interaction that you can't get putting it through an AI filter. [00:02:27] Speaker B: So honestly, with, like, Flux really starting to come into prominence and what it can do for image creation and even video creation. I've seen some Flux videos now, and it's like, yeah, I can still tell some weird artifacting occurs. [00:02:43] Speaker A: Yeah. [00:02:43] Speaker B: But it's. It's pretty damn good.
It's getting very, very close. Yeah, images. I mean, you saw some of the images I was showing you from Flux. It was insane. [00:02:51] Speaker A: Yeah. [00:02:52] Speaker B: And the video is getting. They're getting much, much better. It's like, do we need people anymore? Are we just going to be the WALL-E people floating around in chairs? Like, yeah, we're losing bone density. [00:03:02] Speaker A: We're going to be just stripped of our humanity because, like, everybody's a robot. So what? Like, yeah. Did you see where somebody used, I think it was a Chinese AI that was, like, really good at generating realistic video. And they put in, like, a prompt, and it was supposed to be like Star Wars, and it also generated audio. And so you look at it, and it's. It is pretty good, but they've got their lightsabers, like, down low. [00:03:25] Speaker B: But we're, what, two years away, maybe. [00:03:27] Speaker A: Yeah, it's. It's pretty. The more that I learn about it, and especially now after this experience, seeing it, I'm like, this is crazy. And they were like, oh, you know, maybe like a one time payment, but I'm like, for you to use indefinitely? Forever? Yeah, sorry. No, I already didn't want to do it, and now I'm like, okay, definitely I was on the right path to say no to this. So anyway, so I'm becoming more and more black pilled every single day, and it's great. And we're gonna have some more pieces later on AI in the show that are gonna just piss me off more, but I have more things to get angry about. So you have that to look forward to. But, of course, we do have. [00:03:58] Speaker B: Well, let's poke the bear. [00:03:59] Speaker A: Yeah, it's been poked. We're way past that. But of course, we do have the segment that we usually like to start the show with. So let's go ahead and get right into it. Breaking news. Breaking news. Thank you, Christian, for your service. [00:04:15] Speaker B: You guys could really hear that. [00:04:17] Speaker A: He really gets into it. He puts his heart into it. [00:04:19] Speaker B: He does. [00:04:19] Speaker A: He could be a voice actor. We've got a couple on the chopping block table, whatever you want to call it. Today. Google patches actively exploited Android zero day privilege escalation vulnerability. It's the longest headline, it seems like, in the history of the world. But yes, patched. So that's good news. But actively exploited Android zero day would of course be the slightly, uh oh part. [00:04:38] Speaker B: Yeah, yeah, right. [00:04:40] Speaker A: Just slightly. [00:04:41] Speaker B: Android phones are fairly popular. Yeah, I understand. [00:04:45] Speaker A: I would say they're one of the main two. [00:04:48] Speaker B: Anything about that one? Don't. Never mind. Turn the camera. But yeah. [00:04:55] Speaker A: And it's privilege escalation. [00:04:56] Speaker B: Yeah. So the only kind of saving grace on this is it's a privesc vulnerability. So you've already been compromised. [00:05:05] Speaker A: Right. [00:05:06] Speaker B: So you can't just go, yay, privilege escalator. You got to have privilege first. [00:05:12] Speaker A: Right. [00:05:13] Speaker B: Then you escalate. Yeah, there's that. But it is being actively exploited. So that means if I say, you know, oh, you got to have access to the phone. There are threat actors out there that are pretty good at doing that. [00:05:25] Speaker A: Yeah, you know, that's true. I mean, if somebody really wanted to, they could. It's not impossible.
Says the flaw allows for local escalation of privilege without requiring additional execution privileges. And looking at the severity score, it says it's got a base score of 7.8, which is high. [00:05:42] Speaker B: Right? [00:05:43] Speaker A: Pretty high. [00:05:43] Speaker B: Pretty high in the high range. Not critical. [00:05:45] Speaker A: Right, right. Not quite critical but definitely high. And so if we look at like the impact and exploitability. I didn't realize if you click on it, it'll give you like all these charts. [00:05:52] Speaker B: More details. [00:05:53] Speaker A: Yeah, yeah, that's pretty, like I was just hovering over it before to like try to scroll and get the details, but then it gives you like an explanation of. [00:05:59] Speaker B: Oh yeah, tons of. Tons of. They're actually trying to inform you about the problem. [00:06:05] Speaker A: Which is nice. Yeah, which is nice. We do love to see that. So 7.8. Not, not anything to sneeze at but not quite critical. That's good. [00:06:14] Speaker B: What? It's just one of many. [00:06:16] Speaker A: Yeah, there were. [00:06:16] Speaker B: And there were a couple of criticals. [00:06:18] Speaker A: That was like the main one and there were a few others. Three that were high. Five that were high. [00:06:24] Speaker B: That was kind of funny, man. There were three. They were high. [00:06:28] Speaker A: Me, Daniel and Christian. [00:06:29] Speaker B: I'm just kidding. 420. Most. [00:06:31] Speaker A: Most of these are high ranking in severity, but there are two critical ones as well. So definitely quite a bit that came out of this. [00:06:38] Speaker B: And this is a breaking news. So this is as of today. Like this article is as of today, which is the 4th of September when we film. This will be released tomorrow, which will be the fifth, obviously. [00:06:49] Speaker A: Oh, that's interesting. [00:06:50] Speaker B: What's that? [00:06:50] Speaker A: Because when I click on the links that are shown for those later in the article, it's got a list of four and two of them are critical. When I click on those links and I go to the page, it's 7.8, 7.8. They're both listed as high severity. [00:07:04] Speaker B: So, interesting. Have they changed the CVSS scoring scale or system? Yeah, it went to three, four and. [00:07:10] Speaker A: Nothing for four, nothing for two, and then three says 7.8. [00:07:14] Speaker B: Wonderful. [00:07:15] Speaker A: Has their own ranking system. [00:07:16] Speaker B: They do. [00:07:17] Speaker A: That they use, and so they deemed it critical for them. [00:07:20] Speaker B: Yeah, that is. [00:07:21] Speaker A: Yeah. [00:07:22] Speaker B: What are these references? This a 344-62-0519 I don't know. [00:07:26] Speaker A: Maybe. [00:07:27] Speaker B: Maybe that's a Google thing, the CVE stuff. [00:07:28] Speaker A: Maybe it is a Google thing. So, yeah, that one that was actively exploited, I think was the. That was the heavy hitter, but other ones as well. And of course there are patches, so that's always good news. [00:07:38] Speaker B: Yeah. My phone did not tell me I had a patch, though. Oh, right. [00:07:41] Speaker A: Maybe it's too late for you. [00:07:43] Speaker B: Yeah, I've already done. Too late. [00:07:45] Speaker A: We do have some other stuff in the world of vulnerabilities and such. This one was. I don't think I've seen anything like this before. Not that it hasn't happened. I've just never seen anything like it. Vulnerability allows Yubico security keys to be cloned. That's interesting.
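A quick aside on the severity talk above: CVSS v3.x maps base scores to qualitative ratings, which is why a 7.8 lands in High while Critical only starts at 9.0. Below is a minimal Python sketch of that standard mapping; the function name and the sample scores are illustrative only, not taken from Google's bulletin or the episode.

    # Minimal sketch: CVSS v3.x qualitative severity bands.
    # Helper name and example scores are illustrative assumptions.

    def cvss_severity(score: float) -> str:
        """Map a CVSS v3.x base score (0.0-10.0) to its qualitative rating."""
        if not 0.0 <= score <= 10.0:
            raise ValueError("CVSS base scores range from 0.0 to 10.0")
        if score == 0.0:
            return "None"
        if score <= 3.9:
            return "Low"
        if score <= 6.9:
            return "Medium"
        if score <= 8.9:
            return "High"
        return "Critical"

    if __name__ == "__main__":
        for s in (7.8, 9.8):
            print(s, cvss_severity(s))  # 7.8 -> High, 9.8 -> Critical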
[00:07:59] Speaker B: Yeah. So did you read the article? [00:08:02] Speaker A: I just skimmed it because this was. [00:08:03] Speaker B: I just skimmed it. [00:08:03] Speaker A: I just tossed this in this morning. Yeah, it's breaking. [00:08:05] Speaker B: This is basically a nothing burger. [00:08:07] Speaker A: Really? [00:08:08] Speaker B: Yeah. [00:08:08] Speaker A: Okay. [00:08:09] Speaker B: We ultimately. We should probably come up with a. Right. Just a Christian dude or something, like a clickbait. Oh, clickbait verified. [00:08:18] Speaker A: Yeah. Yeah. [00:08:19] Speaker B: Right. Because this seems a bit clickbaity, because it's like, vulnerability allows YubiKey security keys to be cloned. This would be. This is a horrible day at the beach. [00:08:27] Speaker A: Yeah. [00:08:28] Speaker B: Right. If I can clone a YubiKey. Screw you. Right? This is bad. But when you read the article, it's like, well, yeah, you can clone a YubiKey, but you gotta have like a clean room and access to the key, specialized software and technical knowledge. [00:08:46] Speaker A: The blood of a goat. [00:08:47] Speaker B: Yeah. Yeah. You know, the stars, the moon's gotta be in the 7th house. And Jupiter aligned with Mars. [00:08:53] Speaker A: It'll only work if you're a Gemini. [00:08:54] Speaker B: Yeah. It's very, very specific. So, like, you're not unsafe. [00:09:00] Speaker A: Okay. [00:09:01] Speaker B: Right. Your YubiKey is still safe. The necessary components for someone to be able to pull this off. While, yes, it is possible, it's a non-zero possibility. It's just very unlikely. [00:09:14] Speaker A: Yes, it's a side channel vulnerability, but it went unnoticed for 14 years. So that. And as far as I think it. [00:09:21] Speaker B: Even said that you've. You've got to know what it is, even though. So if I'm using my YubiKey to access Facebook or whatever, you got to know that that's Facebook's key. And there's a lot of information that you've got to know about what's going on right. Before it actually works. [00:09:38] Speaker A: Right. And that's what I'm saying is, like, there's a reason it went 14 years without anybody picking up on it, because it's like, well, what are the odds that that's really. Everything would have to. The stars would have to align so perfectly. [00:09:47] Speaker B: Right. [00:09:48] Speaker A: There was just for this to work for somebody. [00:09:49] Speaker B: Obviously, this is a flaw. [00:09:51] Speaker A: Sure. [00:09:52] Speaker B: In the system, and that's cool that the researchers are finding out, but I feel like it's like, you know, some of these things, especially when it comes to a lot of these side channel attacks, the emanations and things of that nature, audio emanations and that kind of stuff, it's very. You have to build these. These environments where everything is working correctly for it all to really work in a way. I'm not saying that it's never happened, that it can't happen. I'm just saying that that's not very common. So it makes it very difficult. So, to me, this is a bit of a nothing burger. Like, your YubiKey. Still good to go. [00:10:30] Speaker A: And they did provide a fix of sorts. I think it's, like, for certain devices with certain firmware, it won't work. But it's not like they were just like, there's no way that could happen. We're not gonna worry about it, so. [00:10:37] Speaker B: We're just gonna leave it alone. [00:10:38] Speaker A: They did say, okay, yeah, we'll do what we can to plug that hole, but the odds of it.
[00:10:42] Speaker B: Right. So at the end of the day. So now it's even harder to pull it off and. [00:10:46] Speaker A: Right, yeah, but, okay, so it's good news. [00:10:48] Speaker B: Like, babe, you know, I just wish they hadn't. Their. Their title should have been a little more like, researcher discovers flaw in YubiKey. [00:11:02] Speaker A: That, sure, but don't panic or something. [00:11:04] Speaker B: Like that, but don't panic. Yeah, YubiKey is on it. [00:11:07] Speaker A: But then I wouldn't have. Then I wouldn't have clicked on it. So they did their job. [00:11:10] Speaker B: It's all about getting them clicks. [00:11:11] Speaker A: You got me. You got me helping it. [00:11:12] Speaker B: That's you. Good. [00:11:13] Speaker A: So that's all I've got for our breaking news. But Daniel, I know we're not quite done talking about flaws. Like I said, YubiKey did provide some kind of a fix for those vulnerabilities that were found. But in this case, flaws are not going to be fixed. D-Link says it is not fixing four RCE flaws in DIR-846W routers. I probably said that wrong, but four flaws not getting a fix. Is this another one where it's like, oh, that seems alarming, but it makes sense that they're not fixing it? [00:11:41] Speaker B: Yes. [00:11:41] Speaker A: Okay. [00:11:42] Speaker B: Yes, it's absolutely that. But it does highlight a very good idea. Right. A very good philosophy when it comes to the security of your system, your organization, your environment. And that is, is that thing sundowned? And you need to be on top of that. So ultimately, here's the article in a nutshell. You've got these D-Link routers, very popular. Apparently not so much in the States, but in other areas of the world it was quite popular. After a while, they stop being supported because D-Link creates new stuff. They don't want to continue to support things that are X amount of years old. And they probably have that somewhere in their, you know, terms of service and all the other stuff to say, this will only be supported. This is our typical lifespan of a product. [00:12:26] Speaker A: Sure. [00:12:26] Speaker B: So the onus is on you to update and. And upgrade when those upgrades become available. If you're not doing that, I've actually run into this. I was kicking around with an old router that I had at my house and I was like, hold on here, because I was doing firmware dumps and stuff and I was looking at firmware and I was like, I feel like I can just force browse my way to the admin page. And I could. I was like, holy crap. So I started looking, I was like, I'm not the first person to discover this. And I was like, holy crap, I am the first person to discover this. [00:13:02] Speaker A: Wow. [00:13:02] Speaker B: So I contacted, I think it was a Belkin router. Contacted Belkin, and their response was, thank you. Yep, that worked. We don't care because this is no longer a supported device. So I was like, okay, that's cool. Can I publish my findings? They're like, yeah, go for it, you know, because this is not supported, this is an older system that we no longer manufacture nor support. I said, I get that. Totally cool, right? Again, pointing us back to the idea of, if I was running a small office, home office or something like that, and this was my main way of connecting my devices in said office, I would need to be like, okay, after four years, we're going to need to upgrade regardless of whether or not this still works, because it might not be a supported device.
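The force-browsing test Daniel describes boils down to requesting likely admin paths on a device you own or are authorized to test, and seeing whether anything is served without authentication. Here is a rough Python sketch of that idea; the target address and the path list are hypothetical examples, not paths from the Belkin or D-Link devices discussed.

    # Rough sketch of the "force browsing" idea: ask for admin pages without
    # logging in and flag anything that answers without demanding auth.
    # Target address and candidate paths are made-up lab examples.
    import requests

    TARGET = "http://192.168.1.1"  # hypothetical router you are authorized to test
    CANDIDATE_PATHS = ["/admin.html", "/setup.cgi", "/management.htm"]  # guesses

    def check_unauthenticated_access(base_url: str, paths: list[str]) -> None:
        for path in paths:
            try:
                resp = requests.get(base_url + path, timeout=5, allow_redirects=False)
            except requests.RequestException as exc:
                print(f"{path}: request failed ({exc})")
                continue
            # A 200 that isn't a login page is worth a closer manual look.
            if resp.status_code == 200 and "login" not in resp.text.lower():
                print(f"{path}: served without authentication ({len(resp.content)} bytes)")
            else:
                print(f"{path}: {resp.status_code}")

    if __name__ == "__main__":
        check_unauthenticated_access(TARGET, CANDIDATE_PATHS)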
And that's why I put this article in, because it highlights that idea. It's cool. It's not a bad thing that you bought this device. It's. It's not even necessarily like, here's where it becomes a bad thing, is that there are people out there that don't realize, they think if it ain't broke, don't fix it. I understand that mentality and to some extent that makes sense. And in certain contexts, it's what you should do. Yeah, this is not one of those contexts. This is one of those things where it's like, this will sundown, and if I'm not paying attention, now I'm part of the Mirai botnet. [00:14:24] Speaker A: Yeah, yeah. [00:14:25] Speaker B: And, and that's a bad thing. And so these are, these are like straight up 9.8 RCEs, unauthenticated. All the bad things in life we, we hate to see. And D-Link going, yeah, what do you want me to do about it? That's an old thing. We don't. We don't do that anymore. Yeah, you should go to dlink.com if you were satisfied with that D-Link router. We have other options available that we do support. And you just got to keep that. That is a part of doing good security. It's not just doing patches and updates and sanitizing inputs and all that other fun stuff. It's also, hey, does this thing, is this still supported? Because now it's legacy garbage. Bring in the new. With all the new hotness of encryptions and all the things that that old thing can't do anymore. Right. So you gotta stay on top of it. [00:15:15] Speaker A: When you bring up, like you mentioned, if it ain't broke, don't fix it. And for some things that applies, you know, I've been wearing the same glasses for years. They. My prescription is the same. [00:15:21] Speaker B: Right. [00:15:22] Speaker A: Why would I go buy a new pair when these work for me just fine? [00:15:24] Speaker B: Exactly. [00:15:25] Speaker A: But yeah, in this case, it's like, well, no, it's not broken, but the security eventually is gonna be broken, so to speak, and you can't really fix it. There's. There is no fix, I guess. [00:15:35] Speaker B: In this case, it has become a broken state where the device operates as you would expect it to. [00:15:40] Speaker A: Right. [00:15:41] Speaker B: But there are new flaws and vulnerabilities that have been discovered and no plans to patch that. So in that sense, it is now broken. [00:15:48] Speaker A: Yeah, my door works fine. The lock is broken. The lock. Replace it. [00:15:53] Speaker B: Yeah. [00:15:53] Speaker A: And it's time for a new door. Like, you know, so it's. It's. Maybe your device is still working fine, but. Right, right. You gotta keep up with this stuff just for your own safety and security. So, yeah, a couple of these were critical. The other one was high severity. But again, if you're not using outdated stuff, nothing to worry about. [00:16:08] Speaker B: Nothing to worry about. [00:16:08] Speaker A: So. But good that you brought that up. Cause it does raise a good point. Now, this next one. Oh, I've missed this segment. And this is where I start to get angry. Also, this article's gonna. Gonna make me a little bit. It's gonna make me a little upset. But this is part of one of my favorite segments. Do, re, mi, fa, so, la, ti, do. D'oh. Wow. [00:16:30] Speaker B: I thoroughly enjoyed his d'oh way more than yours. [00:16:34] Speaker A: Christian. Christian might have me beat. That was pretty. He sounded like he was in pain. [00:16:37] Speaker B: That was like he was like he was putting the pressure on a certain piece of himself.
[00:16:43] Speaker A: You're okay out there. [00:16:44] Speaker B: It was like, no. [00:16:48] Speaker A: So if you are active at all on any kind of social media, especially on TikTok, this has been trending recently. Chase warns against using system glitch to filch cash. I don't like that word, filch. But regardless, it's fraud, plain and simple. People thought that there was some kind of a real life infinite money glitch in Chase Bank accounts where, oh, I can deposit a check that I don't really have the money to be depositing, and then it'll hit my account immediately. There's no delay, and I can go to the bank right away and withdraw all this money. So they were writing themselves checks for $10,000, $20,000, $30,000, depositing them, and then immediately withdrawing the money, going and buying a new G-Wagon, whatever. And then the next day, they bought. [00:17:26] Speaker B: A new G-Wagon. [00:17:27] Speaker A: Somebody did. [00:17:28] Speaker B: Somebody bought a new G-Wagon with this. Wow. [00:17:31] Speaker A: So. And then the next day, they go, my account's overdrawn $40,000. What happened? Chase Bank, fix this? No. Sorry. [00:17:37] Speaker B: No, we did fix it. [00:17:39] Speaker A: The IRS is knocking on your door. [00:17:40] Speaker B: We fixed this real good. [00:17:42] Speaker A: Committed a crime. So it was, it was, there was a glitch, so to speak, in that. [00:17:47] Speaker B: Explain it to us. What was the glitch? [00:17:48] Speaker A: So there should be a delay. There was supposed to be a delay between funds being deposited into an account and then them being available. And this same with, like, when I deposit a check on mobile, I have to wait a little bit. [00:17:58] Speaker B: Usually take a picture of the check or whatever, it takes a little bit. [00:18:00] Speaker A: I can't just deposit $10,000, instant funds. Right. But in this case, there was no delay. And so they were, there was time for them to go to the bank, withdraw this money before the check would bounce. So it's been fixed. [00:18:11] Speaker B: It was instantly posted. So if I took a. I wrote a check for $40,000, I slid it in the ATM, it would immediately go, cool. I've just incremented your account by $40,000. [00:18:23] Speaker A: Yes. [00:18:24] Speaker B: And then they were going, cool. Withdraw money from the ATM up to the tune of whatever the ATM could give. [00:18:31] Speaker A: Sure. Or at tellers. [00:18:33] Speaker B: Oh, right. They could walk in this, in this. [00:18:35] Speaker A: case, I think it's. Yeah, mostly just. But there were lines around. [00:18:39] Speaker B: Were they bouncing from bank to bank? Because you can only withdraw like, oh, at $10,000 they must get like some kind of a notice. You can't just like pull ten G's out of your bank and walk up to a teller and go, I want $10,000. They have to, like, there's like, there's some weird thing where you got to get that kind of like, approved. [00:19:01] Speaker A: Yeah. [00:19:02] Speaker B: Right. [00:19:02] Speaker A: Yeah. [00:19:03] Speaker B: So they must have been like pulling 9000 something bucks out of each one. Oh, it gets reported. That's what it is. Yeah, it gets reported. Each one. I guess they wouldn't care about that. [00:19:13] Speaker A: Well, now any, well, so, because anybody that did that. Yeah, they were reported, I'm sure, because they're large amounts. So it's like, we better check this out. Especially when it's multiple people. There were lines around the banks at these ATMs.
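The "glitch" the hosts describe comes down to a missing availability hold: deposited check funds were withdrawable immediately instead of after a clearing window. The toy Python model below illustrates that hold logic in the abstract; the two-day hold, the class name, and the amounts are made up for illustration and have nothing to do with Chase's actual systems.

    # Toy model of a deposit hold. Purely illustrative; not how any real bank
    # implements funds availability.
    from datetime import datetime, timedelta

    HOLD_PERIOD = timedelta(days=2)  # hypothetical clearing window

    class Account:
        def __init__(self):
            self.available = 0.0
            self.pending = []  # list of (release_time, amount)

        def deposit_check(self, amount: float, when: datetime) -> None:
            # Funds are recorded immediately but only released after the hold.
            self.pending.append((when + HOLD_PERIOD, amount))

        def withdraw(self, amount: float, when: datetime) -> bool:
            # Release any deposits whose hold has expired, then check the balance.
            still_pending = []
            for release_time, pending_amount in self.pending:
                if release_time <= when:
                    self.available += pending_amount
                else:
                    still_pending.append((release_time, pending_amount))
            self.pending = still_pending
            if amount > self.available:
                return False  # with a hold in place, the "glitch" withdrawal fails
            self.available -= amount
            return True

    if __name__ == "__main__":
        acct = Account()
        now = datetime.now()
        acct.deposit_check(40_000, now)
        print(acct.withdraw(40_000, now))                      # False: still on hold
        print(acct.withdraw(40_000, now + timedelta(days=3)))  # True: hold expired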
People were then getting in their cars and like throwing money in the air like, like they just won the lottery. I'm like, how? There's stupid and then there's you. How do you do this? I know that sounds mean, but it's like, at this point it's called check kiting. This has been around for a long time. How do you, at your big age, how do you do something like this and think that this is totally. No, it's just a free money glitch. It's not GTA. Anyway, sorry, sorry. But yeah, so there were like, how. [00:19:49] Speaker B: Is it that you're a grown person, like you're saying, at your big age. [00:19:53] Speaker A: Yeah. [00:19:53] Speaker B: Who doesn't understand normal economics. [00:19:57] Speaker A: Sure. [00:19:57] Speaker B: And how banks work, you grown and you think, oh, if I just write a check knowing that I don't have that money. [00:20:07] Speaker A: Yeah. [00:20:07] Speaker B: And put it in the system, free cash. [00:20:10] Speaker A: And of course, this is, you're not seeing people that are your age going and doing this. It's mostly people my age that fail to understand how this works. [00:20:17] Speaker B: Even if that were like this, let's say the bank, just like, do you think that they're going to let you keep that money? In what world? [00:20:25] Speaker A: They're just going to be like, oops. [00:20:27] Speaker B: Hey, I won the game, I found the loophole, and I get to keep that money. Let me tell you, if that's the world you live in, you are more than delusional, because them banks ain't letting you keep none of that money. Hell, they don't want to let you have your own money. [00:20:40] Speaker A: Yeah, you have to go. I could go make a legitimate, even just to go print paper checks. I got to wait in line for ten minutes and then stand at the counter while they verify my ID three different ways. And like, I'm like, how did you manage? I know for now, the, anybody that did this, there are holds on their account, seven day holds. And of course, now Chase has fixed this issue. Now where it was immediate, they've, there's a delay now where you deposit a check and you have to wait. But at the time, that was not the case. There was an issue and so there was no delay. [00:21:07] Speaker B: I remember there was a Darknet Diaries that was about these two guys. Well, this one guy that figured out a glitch in a video poker system, he lived in Vegas and he figured out a glitch in a video poker game, and he was raking in the money. He told a friend about it, and him and his friend were just like, they found every place in Vegas where that video poker was and were pulling money out. Then they went to Atlantic City and were doing the same thing. And then I forget how they found out that they were pilfering these people. Listen, they, they, they broke no laws. They legit found a problem in their software. And the way that it worked, they were able to make it win every time they wanted to and make a bunch of money. They, this ain't like the bank system where like, there was a legitimate problem with the bank. You're not allowed to just make money that way. [00:22:00] Speaker A: Yeah. [00:22:01] Speaker B: They won the game in a legitimate way. And guess what? They did not get to keep that money. Right. Legal came after them. [00:22:10] Speaker A: Yeah. Because it's theft. [00:22:11] Speaker B: It's not theft. They legally won that money. But the people with power and the money to give you that money said no. Right. I don't like how you did it. They came for your money. Like.
Like, that's what I said. How is it that you're in this world and not think the people with power and money are going to keep their power and money? [00:22:32] Speaker A: So there was a whole court case about this that. It's on a Wikipedia page. [00:22:35] Speaker B: Yeah, buddy. [00:22:36] Speaker A: But they. The court ruled the government's argument failed to sufficiently meet the exceeding authorized access requirement of the CFAA. [00:22:43] Speaker B: Yeah. [00:22:44] Speaker A: And granted the defendant's motions to dismiss. So he had to give back the money, but he didn't end up. There was no legal repercussion for him. [00:22:50] Speaker B: Yeah, but, like, after all that, they had to spend a bunch of money on legal and everything like that. It's like, you're not keeping their money. Even if they can't get it back from you, they'll. They'll squash you. [00:23:00] Speaker A: Yeah. If this guy was like, I doubt he ever was able to gamble again in any major. [00:23:05] Speaker B: No, he's blackballed. [00:23:06] Speaker A: If I'm a casino owner, everybody, right. That's crazy. [00:23:10] Speaker B: You're never stepping another foot in there, man. It was a crazy. It's been a while since I remember this. It's kind of how I'm remembering it. [00:23:15] Speaker A: But it says it was. Suspicions were raised when he won five jackpots, each with 820 to one odds, in under an hour at the Silverton Casino Lodge. [00:23:24] Speaker B: Yeah. [00:23:24] Speaker A: So engineers came in. [00:23:25] Speaker B: He went a little crazy. [00:23:26] Speaker A: Yeah. Like, you didn't. You could. [00:23:27] Speaker B: It was. It was the guy he brought in. It wasn't the first guy that found the glitch. I think it was his friend that. [00:23:32] Speaker A: Was, like, John Kane. So John Kane discovered a software bug, contacted Andre Nestor, who flew out to meet him, and it was Kane that won the jackpots. [00:23:40] Speaker B: Gotcha. [00:23:40] Speaker A: That was, like, eight in a row. [00:23:41] Speaker B: That got the suspicion. [00:23:42] Speaker A: He won five jackpots in under an hour, each with 820 to one odds. [00:23:46] Speaker B: So it was like, 820 to one odds. That's so crazy. [00:23:51] Speaker A: What are you doing over there? [00:23:52] Speaker B: Guaranteed. Them casinos, man. Them casinos ain't in the business of losing. [00:23:55] Speaker A: And that makes me wonder if he hadn't gotten greedy. Yeah. And if he had not gotten greedy and gotten, you know, I'm not gonna keep playing and playing and playing. If he'd kept his cool, he could. [00:24:06] Speaker B: Have just, like, lived a nice life. [00:24:07] Speaker A: Just under the radar, had enough money. [00:24:09] Speaker B: Then kept, like, taking money out of these poker systems. [00:24:12] Speaker A: Maybe morally, that's not right. But, you know, I mean, if you're already gambling. Yeah. I doubt you're concerned about, like, well. [00:24:18] Speaker B: If you ever look at the odds for all these different casino games, it is so stacked against you. Yeah, right? [00:24:26] Speaker A: So I guess at that point you're thinking like, well, look, already the odds are against me, right? [00:24:30] Speaker B: So it's like card counting at blackjack. It's not illegal, but they sure as hell don't like it. And if they find out you're doing it, you're no longer welcome, you're gone. Right. [00:24:39] Speaker A: Interesting. So, I mean, at least that guy, there was a little bit of ingenuity that went on there, a little bit of creativity.
[00:24:45] Speaker B: He did it accidentally, it was like an accidental discovery. [00:24:47] Speaker A: You found it. [00:24:48] Speaker B: He was just playing video poker. And he realized, like, whoa. And then, oh, it did it again. He. So he fiddled with it until it did it again, and then he figured out how to make it do it. [00:24:58] Speaker A: Again. An accidental hacker. [00:25:02] Speaker B: And he's like, you just gotta keep doing this until. [00:25:04] Speaker A: That's pretty funny. Yeah, I would have loved to have, like, seen that happen and watch him be like, yeah, would you look at that? He's just some random dude. Said he was a pianist. Like, that was his. He was a piano player. That was his day job, and he was playing the wrong keys. I guess that's interesting. But, yeah, I mean, you gotta figure you're gonna get caught at some point, and it's the same. Goes back. [00:25:22] Speaker B: I mean, this is simple logic, kids. This doesn't take. My son, he's become a neckbeard. It's very funny. Your son, he's three and a half, and his favorite word in the world is actually. [00:25:32] Speaker A: Oh, my God. [00:25:34] Speaker B: You're like, yo, you wanna watch Ninja Turtles? He's like, actually? I wanna watch the Ninja Turtles. That. With this. And actually, Donatello is actually. And I'm like, stop saying actually. What are you doing with actually? How did you latch on to the word actually? [00:25:49] Speaker A: He's gonna start going like this every time he talks. [00:25:51] Speaker B: Actually, I actually took my glasses off and stuck them on the bridge of his nose. I'm like, say actually and push the. We videoed him doing it. [00:26:01] Speaker A: It's great. That's the great thing about having kids. It's like an endless source of entertainment. [00:26:04] Speaker B: He's just like a little puppet you can make dance whatever you want. [00:26:06] Speaker A: Actually, Donatello is the most overpowered Ninja Turtle. [00:26:09] Speaker B: Yeah. It was so funny. [00:26:11] Speaker A: That's cute. That's cute. That's the only time that that conversation is cute. And it's like, aw. The second you have that conversation with an adult, it's like, okay, yeah, real. [00:26:18] Speaker B: Neckbeards are just douchebags. [00:26:19] Speaker A: Yeah. No offense to anybody that's watching. Anyway, if you. If you tend to use that, like, well, actually, actually make those arguments, you know? [00:26:28] Speaker B: I know. I had this, actually, that's where he gets it from. I do this. Right. I had this conversation with someone in our comment section. [00:26:36] Speaker A: Yeah. [00:26:36] Speaker B: Because he actually reached out to me. Here we go again. He reached out to me on my YouTube channel, I think it was. Left me a comment, and he said, oh, I didn't know you had a YouTube channel. Right. And so he put a comment in the Technado comments, and it was really cool. His comment was about how he had experience with that certain specific thing that we were talking about. I forget what it was. Oh, it was OFBiz. [00:27:01] Speaker A: Oh, yeah, right. [00:27:02] Speaker B: And he was explaining what type of software that was. And he works in that space. [00:27:06] Speaker A: Sure. [00:27:06] Speaker B: He was really kind of adding to the conversation. [00:27:08] Speaker A: You should be helpful. [00:27:09] Speaker B: Yeah. Dude, there. There was no douchery around any of your comments because you said, I don't want to be that actually guy. [00:27:17] Speaker A: Yeah.
[00:27:17] Speaker B: I said, no way. I said there was no douchery around your comments, which is the defining characteristic. [00:27:25] Speaker A: Well, right. "Actually" guys don't worry about being "actually" guys. They just do it. [00:27:29] Speaker B: They just want to feel superior. [00:27:30] Speaker A: Yeah. Yeah. If you're genuinely trying to be helpful and you frame it that way, like, hey, just. [00:27:34] Speaker B: That is very welcome. [00:27:35] Speaker A: Yeah. Just a pointer. Like, hey, yeah. My take on it. We totally welcome that. Yeah, very welcome. Absolutely. [00:27:39] Speaker B: Yep. [00:27:39] Speaker A: So, yeah, don't. Don't. Don't steal money. Don't check kite. That's just. That's been a thing forever. [00:27:44] Speaker B: Hey, it's illegal. [00:27:45] Speaker A: It's illegal. And also, it's just dumb. Why would you. You're putting yourself in debt. Why? [00:27:48] Speaker B: Just. [00:27:49] Speaker A: But I could go on. [00:27:50] Speaker B: So anyway, what do they say? Play stupid games, win stupid prizes. In this case, maybe some people win a $40,000 prize. [00:27:58] Speaker A: Yeah. Yeah. Maybe. Maybe some hard time. [00:28:02] Speaker B: Oh, yeah. You like that G-Wagon? I'm so glad I. [00:28:05] Speaker A: We'll be taking that. [00:28:05] Speaker B: Yeah. [00:28:06] Speaker A: So we do have. We'll take a break here in a second. Should we get one more in and then take our break? [00:28:12] Speaker B: What are we on here? [00:28:13] Speaker A: We got five left. [00:28:15] Speaker B: What did we do? [00:28:15] Speaker A: Four. We did a couple breakings and we did, it's the CAN-SPAM thing, which I do have. [00:28:22] Speaker B: All right, let's go to that one and then we'll take a break. [00:28:23] Speaker A: And we'll take a break. [00:28:24] Speaker B: All right. [00:28:25] Speaker A: So a company called Verkada, or Vercada, depending on how you pronounce that, is gonna be paying $2.95 million for alleged CAN-SPAM Act violations. And I think this headline buries the lede a little bit because, yes, there's the CAN-SPAM Act, and it says what it stands for somewhere in here. But the way they violated the CAN-SPAM Act is because they bombarded customers with promotional emails without giving them opt-out choices. And that's annoying and it's not good. And violating the terms of this act by doing that. Yes. But I feel like the bigger thing here is that the FTC is going to require, it's a security camera vendor, Verkada. They're going to require them to create a comprehensive infosec program as part of a settlement because multiple security failures enabled hackers to access live video feeds from Internet-connected cameras in, like, women's health clinics, psychiatric hospitals, prisons, schools. That to me, I would be more worried as a customer about that. [00:29:16] Speaker B: I didn't even take away the fact that they had the whole bombardment thing. The only thing I took away from this article was security. Their cameras were accessed. [00:29:24] Speaker A: Yeah. [00:29:25] Speaker B: And poorly secured. [00:29:26] Speaker A: So. [00:29:26] Speaker B: And for them to tell their customer base that we implement the highest levels of security. Yeah, like, that's what I took away. [00:29:33] Speaker A: From this article as a customer. Like, yeah, it's gonna be annoying if a company bombards me with emails I don't want and that's probably gonna piss me off. [00:29:39] Speaker B: I'm like, the old Outlook rule, right? [00:29:42] Speaker A: Spam, spam, trash, trash, trash.
Blocked and reported. [00:29:45] Speaker B: Yeah. [00:29:46] Speaker A: But I'd be more concerned about, oh, your cameras were accessed. Like, there's all these security failures that were overlooked that allow these cameras to be accessed by bad actors. And that's, that's probably going to be a more concerning issue to me. To me. What do I know? You know, luckily I'm not their kind of customer. But anyway, the FTC said they failed to implement basic security measures to protect cameras from unauthorized access. But also, like you said, misrepresented security to customers by making all these promises, reviews, quote unquote, that were submitted by investors to be, to basically give it some credence like, oh, yeah, this is great. The security is great. [00:30:22] Speaker B: So you're. Those are called lies. [00:30:24] Speaker A: Yeah, you're lying. You're lying is what you're doing. You can wrap it in a pretty bow, but you're lying. But I just thought it was interesting that they led with the CAN-SPAM thing in the headline. Like, they violated, oh, they gotta pay all this money. And then also all these security issues. You better hope you weren't in a women's clinic when these cameras, like, also ran. And this has been going on for a couple of years. It was back in March 2021 that there was. That there was initially this group of hackers that leveraged a vulnerability to get admin-level access. So it's like, this is not a new thing. It's been ongoing. [00:30:54] Speaker B: Well, not only that, so a flaw in their legacy firmware build server. Oh, so before that incident, in December 2020, a hacker exploited a flaw in a legacy firmware build server within Verkada's network and installed Mirai to use it as a DDoS platform. Yeah, there's that. That's fun too, right? That's like, their ability to put security into their systems was the same as my ability to, like, breathe in space. [00:31:23] Speaker A: In space, no one can hear you breathe. [00:31:25] Speaker B: The O2 quality is very poor. [00:31:30] Speaker A: You could try for about half a second. [00:31:32] Speaker B: I'm having trouble. [00:31:33] Speaker A: Yeah, yeah. And you probably would freeze before you can get a breath out. Right? [00:31:36] Speaker B: It's cold. Plus, the vacuum of space isn't very good for your body. You start to expand. There have been people that have been exposed to the vacuum of space. It's crazy. [00:31:45] Speaker A: You know, you can't wear. [00:31:46] Speaker B: I've never read the accounts of that happening. It's flipping nuts, really, of people being exposed to, like, super high altitudes or even orbit, like, in space. [00:31:59] Speaker A: Wow. [00:31:59] Speaker B: It is nuts. [00:32:01] Speaker A: I was reading something that was like. [00:32:02] Speaker B: They survive it, supposedly. [00:32:05] Speaker A: Like, you can't. There's certain undergarments you're not supposed to wear in space because it's too constricting on the body and you start to expand and it can, like, actually hurt you. So. Yeah. Astronauts can't wear boxers, I guess. Yeah, that's crazy. [00:32:18] Speaker B: I just feel like the king of the world up there. Yes. I love space. [00:32:26] Speaker A: The jokes are coming, but this is a PG. [00:32:28] Speaker B: I'm not allowed to say them, though. [00:32:29] Speaker A: It's a PG-13 platform. [00:32:31] Speaker B: You know what's happening inside. [00:32:32] Speaker A: You know what I'm talking about. [00:32:33] Speaker B: It's all up here. Beaming it to you. [00:32:35] Speaker A: Yeah.
Oh, it could be. Yeah, she knows I do. Oh, I've heard it firsthand. Anyway, so back to Verkada. Nothing that bad. So the camera vendor did not realize that initial compromise you were just talking about with the DDoS until a couple weeks later. And it was when AWS said, hey, there's some suspicious activity going on. Check this out. [00:32:55] Speaker B: They went, what's this? Yeah, right here. [00:32:58] Speaker A: This is a little sus. [00:32:58] Speaker B: This region, you might want to check that out. [00:33:01] Speaker A: Don't love that. [00:33:02] Speaker B: Not a fan. Not a fan. [00:33:04] Speaker A: They claim to use, Verkada does, best in class. World class, perhaps. Data security tools and best practices to protect customer data. Which is deceptive and does not represent the truth. In other words, they lie. [00:33:16] Speaker B: I guess they were banking on the fact that, like, best in class and world class were a little too subjective to be nailed down on, and the. The government thought otherwise. [00:33:27] Speaker A: Best in class. If it's a first grade class, that's, you know, it depends on your perspective, right? [00:33:31] Speaker B: Yeah. [00:33:32] Speaker A: So they did not implement basic security measures like demanding the use of complex passwords or encrypting customer data at rest. So just basic stuff that you should probably be doing, especially if you're a big company like that that has access to video feeds like that. The thing that happened back in March 2021, they were able to access, these hackers, attackers, were able to access 150,000 live camera feeds. So this wasn't even, like, a couple isolated incidents, one building, you know, one organization. 150,000 across, I'm sure, thousands of organizations. They didn't say. [00:34:00] Speaker B: They also were able to leverage those video feeds to see sensitive information from those customers. So, like, the video camera can see my desk, and I've got PII or whatever sitting on there, and they. Oh, what's that? [00:34:14] Speaker A: Several gigabytes of video footage, screenshots, and customer details. That's great. Awesome. I have so much hope in my heart for the future, but, yeah, in this case, the FTC is basically saying, like, Verkada, it's time to. You have to pay. So we don't agree, but, okay. We don't agree with these terms, with these allegations, but we'll accept the terms of the settlement. Can't fight city hall, right? Yeah. Hey, just so you know, you know, we're just. We're just settling because it's like, well, what are you gonna do? But we're right on this one. So, anyway, I just thought that was interesting. They kind of buried the lede a little bit with the headline. [00:34:46] Speaker B: They did not. [00:34:47] Speaker A: That $2.95 million is small potatoes, but I feel like that's not the main issue. [00:34:50] Speaker B: I ain't got that kind of money. [00:34:51] Speaker A: I certainly don't. I don't think I have $2.95. So, you know, that's. Yeah, that's. [00:34:57] Speaker B: Don't actually pay us here, kids, gratis. That's right. [00:35:01] Speaker A: We will. We will go ahead and take a break so that I can recompose myself and get angry all over again when we come back. So don't go away. We'll be right back with more here on Technado. Hey, I'm Sophie Goodwin, edutainer at ACI Learning and subject matter expert for our new course, Cybersecurity Fundamentals. If you're new to cybersecurity, this is the course for you. Anyone from high school students to professionals switching careers.
These episodes are designed for you as an introduction to essential security terms and concepts. So we'll walk through security principles, governance, risk and compliance, access controls, threats and attacks, incident response, network security, and we'll look at some best practices for security operations. Security doesn't have to be scary. Check out Cybersecurity Fundamentals in the ACI Learning course library. Welcome back. Thanks so much for sticking with us through that break. We got to revisit our conversation from the other week about months and ages, and it was. It was riveting. We had quite the heated debate, but definitely unfruitful, though. Yeah. Didn't really reach, like, a definitive final, you know, conclusion, but it was good. Good practice in debate, I would say. It wasn't practice in problem solving and critical thinking. So, speaking of problem solving and critical thinking. Not really. That was a terrible transition. This article has nothing to do with that. Daniel, I might be looking to you. [00:36:22] Speaker B: I'm going to see if I can't defy that logic. Okay. [00:36:24] Speaker A: Okay. Well, yeah, challenge accepted. I might have you take the lead on this one. Docker-OSX image used for security research hit by Apple DMCA takedown. Ooh, boy. Legal stuff. This is legal stuff? [00:36:37] Speaker B: Yeah. Legal interpretation of the law. Because what else would you be legally interpreting? [00:36:44] Speaker A: It's French. [00:36:45] Speaker B: Now, the law. So there is a user out there, Sick Codes. [00:36:52] Speaker A: Okay. [00:36:53] Speaker B: Security researcher. And they made a Docker image available in Docker Hub of Mac OS X. [00:37:01] Speaker A: Okay. [00:37:02] Speaker B: You're familiar with Mac OS X, I'm assuming, right? [00:37:05] Speaker A: I don't use it. [00:37:06] Speaker B: You've heard of this little company, yes, we call Apple. [00:37:08] Speaker A: I've heard of it. [00:37:09] Speaker B: They might make it. I think. I think they got some steam. [00:37:12] Speaker A: Yeah. [00:37:14] Speaker B: Here's the thing. Apple, not a fan of anyone running Mac OS X on anything other than an actual Mac, their proprietary hardware. Guess what? We don't live in a perfect world there, fruit company, and people were able to basically create an image. And if you run VirtualBox, you can create a virtual machine of Mac OS X. I don't know the details. I think I've only ever done it like once or twice, just to kind of play around with it. I don't really like Mac OS X a lot, so it's not like it's something I'm interested in. So I just did it for the sake of having the experience. But now, apparently Sick Codes was like, cool. I can virtualize this in a container and then make that container available, and we can use that for the purposes of security research. Now, you don't have to go buy a three or $4,000 MacBook or iMac or whatever the case is. You can. And I'm sure there are other more inexpensive ways of getting into the latest Mac OS X. Sure, but whatever. That's neither here nor there. No, I'm just reporting the facts. That's what I'm doing right now. I don't agree or disagree with how it is used after that or whether or not he could have got it. [00:38:35] Speaker A: Do with that information what you will. [00:38:36] Speaker B: I'm just telling you what happened and their motivations behind it. [00:38:40] Speaker A: Don't shoot the messenger. [00:38:42] Speaker B: So, security researchers, this is for you. You go out there, you pull that Docker image down.
Now you can start looking for security issues. And Apple does have a current bug bounty program or responsible disclosure, however you want to call it. I think there is some nuance between the two things. But regardless, and apparently this has been very fruitful in aiding security researchers, according to Sick Codes, in their efforts to find security flaws. Report them to Apple and get them patched. Well, the Big Apple machine does what the Big Apple machine does and went, I don't really care how you're using that. You're doing what I told you not to do. And so they got legal involved and they issued some sort of, like, takedown request at Docker Hub based on the DMCA, the Digital Millennium Copyright Act, that they were violating their DMCA or the DMCA through breaking whatever it is, I guess, encryption or whatever they use that safeguards the code for Mac OS X. Yeah. And Docker Hub was like, down it goes. And. Yeah. So now it's a battle between Sick Codes and Apple, philosophically and legally. Did we break the law? Is this an illegal thing to do? Does the intents and purposes behind it being done negate the law that says you can or cannot do it? Did they break the law? And whether or not they did or did not do that, I don't know. All I know, again, reporting the facts. Apparently it's still up on GitHub because, I forget why. There were some minutiae reasons why it's okay for it to be in GitHub right now and not in Docker Hub. [00:40:36] Speaker A: Okay, a little bit of a difference. [00:40:39] Speaker B: This is the philosophical argument: should Mac say, yeah, it's cool to virtualize this for the sake of security research? And then you can go, well, yeah, you can do security research on it, but you can also just pirate Mac OS X. I don't know why anybody would want to do that. I'll be honest with you, I am not a fan of Mac OS X. I like their hardware, though. Super nice. [00:41:07] Speaker A: I can tell because, aren't you using? You're on Mac right now, right? MacBook right there for our shows and stuff. [00:41:12] Speaker B: MacBook Air. Yeah, it's M2, I think. [00:41:15] Speaker A: Okay, this is interesting. I'm trying to find the section specifically, because in the thing that they sent to Sick Codes, it says they were specifically in violation of Section 501 of Title 17 of the United States Code. So the DMCA. So I was trying to find specifically the wording on that, what it said. And so it looks like it's just anyone who violates any of the exclusive rights of the copyright owner as provided in certain sections, or who imports copies or phonorecords into the United States in violation. So obviously there's a whole bunch of, like, referred to in these sections, as referred. So you'd have to go to each individual, like over here, over here, over here, which I probably could do, just taking a long time, is an infringer of the copyright or right of the author. So, to oversimplify it, it's a copyright infringement issue. But I'd just be curious to know exactly, like, the specifics.
I would assume legal got involved and that legal was like, yeah, yeah, they probably got some leg to stand on. So let's just be sure. Let's err on the side of caution. And it. But did their legal department think that Apple had a case enough that they were like, yes, we need to take this down, or were they, like, we're erring on the side of caution? If that's the case, why didn't they err on the side of, they didn't violate that? You know what I mean? Like, I don't know what goes on with legal stuff. I'm not a legal expert. Obviously. [00:43:00] Speaker A: It does say Apple's EULA restricts the use of that operating system to Apple-branded hardware. So I guess you could make the argument that, like, if somebody's pulling it from Docker. [00:43:07] Speaker B: Yeah, but the DMCA is all about, like, if I'm not, again, I'm no legal scholar, but from what I understood. So, like, if I have a DVD, right, and I want to rip that DVD and put it on my Plex server, right. It is not illegal for me to make a digital backup of my DVD. It is illegal for me to break the encryption on the DVD to make the backup. So you're kind of stuck in this catch-22. Yeah, if I don't. I don't know if there's more nuance. Obviously there is, to the DMCA and how they're applying it to this particular instance. And I totally get, like, they have proprietary software, they have intellectual property that they do not want to be open sourced out to the world for everyone. Again, why would anybody want it? [00:43:54] Speaker A: Yeah, why. [00:43:55] Speaker B: Why am I running a Mac OS. [00:43:56] Speaker A: X virtual machine for other than purposes of security research? Which you would think. [00:44:01] Speaker B: What other purposes am I? I mean, I just because I. Maybe it's just because I hate Mac OS X that bad. I legit can't think of any good reason. [00:44:09] Speaker A: Yeah, yeah. [00:44:10] Speaker B: To use it. [00:44:11] Speaker A: But it looks like it wasn't just. I mean, the DMCA was obviously, that was what was mentioned in their takedown notice to this person, to Sick Codes. I don't know the person's real name. But from a legal perspective, in this article, they also mention the end user license agreement that. That Apple has implemented. So outside of a US law standpoint. [00:44:29] Speaker B: So they got a EULA that they're violating. [00:44:31] Speaker A: They do. So it's a little of both. So, okay, if the DMCA stuff isn't enough, there's our own guidelines that you're breaking. A one-two punch. So you can't really. So maybe that's why it was like, eh, yeah, one way or the other. Like, they probably have. You see, there's. [00:44:44] Speaker B: A logical argument to be made and. [00:44:46] Speaker A: We're never gonna get to the. We're never gonna get to the end of this. It's. It just. It stems from our conversation we had during our break. There's gotta be a logical argument. And you're right. [00:44:53] Speaker B: Yeah. If there's not, then it's arbitrary and meaningless. [00:44:56] Speaker A: But you're right in that this. You would think all you're really doing is preventing security researchers from finding bugs. And I would think that would be. [00:45:03] Speaker B: Well, I mean, maybe there's an argument that maybe that all you're doing is not just allowing, stopping researchers from finding. And maybe they could come. There's a myriad of ways in which they could invent a system for security researchers to have access without having to go buy a MacBook. [00:45:21] Speaker A: Yeah.
[00:45:21] Speaker B: Right. They would just need to go, hey, I want to work with you. I think that's really where it kind of gets frustrating. Okay, Apple. Cool. You got a leg to stand on. You don't want your stuff out there. Why don't you create a system that allows people to do security research where they're registered security researchers with you? So if you find them in violation of their agreement with you as a security researcher, you now revoke their privilege, maybe even take legal action against them. There's ways in which you could, like I said, invent systems to give a security researcher easy access to that system so they could find those bugs. And it benefits Apple.
[00:45:57] Speaker A: Right.
[00:45:58] Speaker B: Yeah. Instead, they're just going all ham-fisted and telling people, I don't like what you're doing.
[00:46:06] Speaker A: Yeah.
[00:46:07] Speaker B: Maybe that's where it starts. I don't know.
[00:46:09] Speaker A: Don't play with my toys.
[00:46:10] Speaker B: Right.
[00:46:10] Speaker A: My terms only. So.
[00:46:12] Speaker B: Well, it can seem like that's how it comes off.
[00:46:14] Speaker A: Yeah.
[00:46:15] Speaker B: Like, you got to put yourself in both of their shoes.
[00:46:17] Speaker A: Sure.
[00:46:18] Speaker B: Right. Apple's. It's easy to go, well, they're a trillion-dollar company. What do they care?
[00:46:24] Speaker A: Right.
[00:46:25] Speaker B: Stay a trillion-dollar company.
[00:46:26] Speaker A: Right. They do have to, at some point, draw a line. It's their stuff. So, yeah, I'd be curious to know what folks out there think and what their opinion is on this.
[00:46:34] Speaker B: It's very interesting. That's why. That's why I picked this article. It's very interesting. Philosophical and logical arguments, legal arguments to be made on both sides, and I'd be interested in seeing how it plays out.
[00:46:46] Speaker A: Yeah.
[00:46:47] Speaker B: What did the courts decide? How does that case law, or that precedent, transfer into other areas? 'Cause you gotta remember, it probably won't stop here. It'll bleed into other cases.
[00:47:00] Speaker A: Yeah.
[00:47:00] Speaker B: Right. And sets precedents.
[00:47:01] Speaker A: Give them an inch, take a mile.
[00:47:02] Speaker B: Right. Who knows?
[00:47:03] Speaker A: Be careful.
[00:47:04] Speaker B: You gotta. You gotta really worry about that. And so hopefully there's a good judge involved. Hopefully there's good legal minds involved that can, in a good, fair way, adjudicate and work this out. I hate watching. You watch these movies and stuff where it's a court case and someone's wrongfully convicted.
[00:47:19] Speaker A: Oh, yeah.
[00:47:20] Speaker B: It's like, I hate that our legal system has kind of devolved into a way of persecution. It's not about getting at the truth. What do the lawyers always say? It's not what's true. It's what we can prove in court. You hear that said a lot.
[00:47:41] Speaker A: Yeah.
[00:47:42] Speaker B: I hate that. If our legal system is not for getting at the truth, what's the point? It's. It's very frustrating.
[00:47:50] Speaker A: Yeah.
[00:47:51] Speaker B: Right. I want the truth. Not, I can't handle the truth right now. I can't handle the truth. A-hole.
[00:48:00] Speaker A: Yeah.
[00:48:02] Speaker B: And that's all I want. Nothing but truth. Nothing but truth.
[00:48:07] Speaker A: Yeah. Sounds like an outro for SiriusXM, like. Like one of their radio shows, but, yeah.
I'd be curious to know, for those of you that are watching and listening, what's your opinion? Comment, let us know. Let's hear your take on that. So I have a couple more articles we're gonna hop through real quick. I know we're nearing, you know, the end of our program, but we do have a couple more.
[00:48:24] Speaker B: Oh, we are?
[00:48:25] Speaker A: Yeah. Well, they do tend to run long, but that's okay. We're having fun. These are. I'm gonna get angry again, so we'll just get right into it. Lovely little segment that we do love.
[00:48:35] Speaker B: Sophie gets pissed.
[00:48:37] Speaker A: Swirling vortex of anger or whatever is going to be our newest segment. But this is a segment that we do love called pork chop sandwiches. Pork chop sandwiches. Pork chop sandwiches. Oh, Christian.
[00:48:52] Speaker B: Pretty sure Christian just had a seizure.
[00:48:53] Speaker A: He censored it a little. He did censor himself. That's good. So the city of Columbus is suing a researcher, a security researcher, after a ransomware attack. This is a little bit of deja news, but I don't think we covered this, really, back when the breach happened, when the ransomware attack happened. City of Columbus, Ohio: there was a ransomware attack that happened back in July. And they basically said, hey, you know, we've got it under control. Told people some things about it. Don't worry, certain data wasn't disclosed, you're all good. A security researcher who goes by the name Connor Goodwolf. His real name is David Leroy Ross, but Connor Goodwolf is his alias, I guess his Batman name. He said, hey, the city's not telling the full truth about this. The stolen data. This is a bigger deal than they're making it out to be. Names, Social Security numbers, private data dealing with police officers and crime victims. It's serious stuff that got leaked and is available on. On the dark web. The nebulous dark web. Okay. He said, hey, you're not being honest about this. The city said, no, no, no. You don't know what you're talking about. It's fine. Don't worry about it. He said, okay, bet. Took a sample of this data. Here you go. This is the data that's publicly available, or a piece of it, to prove to you that I'm being honest, and I'm trying to tell people the city's lying about this. The city then said, how'd you get that data? You went to the dark web. You're colluding with the gang. We're suing you. And this, to me. There's stupid, and then there's you. Like, I don't. How do you. If I go to the police and I'm like, hey, I have evidence that my neighbor's dealing drugs or whatever, and they're like, bring us proof. Bring us proof. And I say, okay, hey, this bag of cocaine, I found it in his house. And I bring it to them, they're like, you had cocaine? You're working with him. What? I'm bringing you proof that he's breaking the law because you said, no, you can't provide proof. You can't provide proof. This guy's offering a sample of this data. And I guess maybe there's a minuscule possibility that he's Connor Badwolf and he is working with the bad guys and he's a liar. I just find it hard to believe. He's a security researcher. He's trying to be helpful and inform people.
[00:50:45] Speaker B: If he's working with them, he's doing a really bad job of it.
[00:50:49] Speaker A: Why would he then bring this data forward? Like, it just.
[00:50:52] Speaker B: So. Honestly, what I'm taking away from this and what makes me fearful.
[00:50:56] Speaker A: Yeah.
[00:50:57] Speaker B: Is that it's a government entity. Correct?
[00:50:59] Speaker A: Yes.
[00:51:00] Speaker B: That is using their power.
[00:51:02] Speaker A: Yeah.
[00:51:03] Speaker B: To basically kind of squash the truth of a matter that is negative to them.
[00:51:11] Speaker A: Yes.
[00:51:11] Speaker B: So that it does not become public and give them a bad name in the public eye. And not only that, they use their power to basically turn the good guy into a bad guy. Right? Yes, basically. Tell me if I'm misunderstanding this. It seems like what is happening is their retaliation, or their answer to his "hey, here is proof that this is worse than what you're saying," is to go: you're the bad guy, right? And now we're gonna. We're gonna sanction you.
[00:51:46] Speaker A: What does it say to other security researchers out there?
[00:51:48] Speaker B: Right?
[00:51:48] Speaker A: Okay, fine. I'm not touching that. Yeah, you want my help? Too bad. Deal with it yourself.
[00:51:52] Speaker B: So it's not enough to agree with Big Brother. You must love Big Brother.
[00:51:56] Speaker A: They're in your mind. They're always watching. The thing that gets me is it keeps referring to it as the city. There's not, like, an individual that is accusing him of this.
[00:52:04] Speaker B: It's the city.
[00:52:05] Speaker A: The city. The city as an entity. Right. So.
[00:52:07] Speaker B: So it tells you it's the system that is working against him.
[00:52:09] Speaker A: But the city, in accusing this guy of colluding with this gang, says their reasoning for this is: this data may be publicly available, but it's only truly accessible to those who have the computer expertise and tools necessary to download data from the dark web. Daniel, correct me if I'm wrong. You have the expertise and tools necessary to access the dark web and download data if you so chose, don't you?
[00:52:30] Speaker B: Yes. As do you.
[00:52:31] Speaker A: So then you must be a bad guy.
[00:52:32] Speaker B: And we are obviously bad people that should be put behind bars.
[00:52:35] Speaker A: Yes. Shiny bracelets just for you.
[00:52:38] Speaker B: Right. Again, are we? So what you're saying is that people with the tools and expertise to do it, by the very nature of the fact that they have those tools and expertise, they are the enemy. Is that what you're. Is that what they're saying?
[00:52:54] Speaker A: Essentially, yes.
[00:52:55] Speaker B: That's what it seems like they're saying. And I won't put words in their mouth.
[00:52:58] Speaker A: And they are saying that this Connor Goodwolf, his actions are an invasion of privacy, even though this data is publicly available, if you know how to find it on the dark web. Right. They're seeking a restraining order to prevent him specifically from accessing that stolen data on the dark web. The guy that's trying to help you.
[00:53:15] Speaker B: Well, I love how they frame that. Hold on. Keep your thought. Yeah, that they say if you know how to find it. Sure.
[00:53:22] Speaker A: Well, that was my. That was my phrasing.
[00:53:24] Speaker B: Oh, you're adding that. Yes, that's your.
[00:53:25] Speaker A: I'm saying. 'Cause it's.
[00:53:27] Speaker B: You're implying, so don't imply something they didn't say.
[00:53:29] Speaker A: So they were saying it's.
[00:53:30] Speaker B: Are they saying it is publicly available?
[00:53:32] Speaker A: They said, although the information is publicly available.
I'm trying to provide context, because it's not like I can type. Are they.
[00:53:37] Speaker B: Insinuating that if you know how to.
[00:53:39] Speaker A: Find it, it's publicly available, but only truly accessible if you have expertise and tools?
[00:53:44] Speaker B: That's like saying.
[00:53:45] Speaker A: That's their phrase.
[00:53:46] Speaker B: Hold on. That's like saying that city hall is publicly available if you have the legs to walk up the stairs and access it. We didn't provide on-ramps for people who are wheelchair-bound, or X, Y, or Z. Like, only if you can get in. It's. But yeah, that doesn't make sense. Yeah, right.
[00:54:03] Speaker A: That is their phrasing.
[00:54:05] Speaker B: It's either public or it's not. Whether or not I have the tools is not significant to the argument.
[00:54:12] Speaker A: Right. And that's their argument, is that it's publicly available but only truly accessible. So they contradict themselves.
[00:54:17] Speaker B: Correct.
[00:54:17] Speaker A: By saying that it is either publicly.
[00:54:19] Speaker B: Available or it is not. All publicly available things require some sort of access.
[00:54:25] Speaker A: Well, the idea that it's an invasion of privacy for him to access what they have confirmed is publicly available information. If I go to the library and read, like, a newspaper, they've got archives I can go back and look at, years past. Is that okay? Just because it's not, like, out on the street and I have to go to the library to get it, it's still publicly available to me because I know where to look for it. So am I then invading somebody's privacy? Like, it. None of it makes sense to me. None of it makes sense.
[00:54:50] Speaker B: And it's not the first time.
[00:54:51] Speaker A: This has happened, that something like this.
[00:54:53] Speaker B: Something like this. Yeah. There was some hacker that was. He found a flaw online. I want to say it was an AT&T website or something, where you could just basically force-browse a username or a user ID. If your user ID was 20 and you put 21, you were now logged in as that person. Right. This is like an IDOR.
[00:55:16] Speaker A: Yeah.
[00:55:16] Speaker B: Right.
[00:55:17] Speaker A: Interesting.
[00:55:18] Speaker B: And he pointed it out and said, hey. They arrested him, put him in jail, just for stumbling upon it and saying something.
[00:55:27] Speaker A: That's insane, because it's not like you're.
[00:55:29] Speaker B: Going. There's a little more to it than that, but that's kind of the gist of it.
[00:55:32] Speaker A: That's crazy. And especially because he disclosed it. It's not like, oh, we found that he was using this to access accounts, because then you'd have a leg to stand on as far as, like, he was exploiting this and didn't tell anybody.
[00:55:40] Speaker B: What frightens me is when any kind of government or large, powerful organization starts using their capabilities to say, I know that's true, but shut up. Right, right. That shit scares the crap out of me.
[00:56:00] Speaker A: Well, and if every time you try to do the right thing and do something good, you get punished for it, right?
[00:56:07] Speaker B: What does that teach the other people?
[00:56:08] Speaker A: Okay, fine, screw it. I'm not helping you. Like, I'll just stop doing what I'm doing and.
[00:56:13] Speaker B: Well, because in their mind. It seems like, in their minds, that is, the city or whatever, you weren't helping.
[00:56:20] Speaker A: Right. You're just making us look bad.
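A quick aside on the kind of flaw being described in that anecdote: it's what security folks usually call an IDOR, an insecure direct object reference, where the server trusts an identifier supplied in the request instead of checking that it belongs to the authenticated user. The sketch below shows the vulnerable pattern and the fix in Python/Flask; the routes, IDs, and data are hypothetical, invented purely for illustration, not taken from the incident being discussed.

# Minimal IDOR sketch (hypothetical routes and data, not any real site's code).
from flask import Flask, abort, jsonify, session

app = Flask(__name__)
app.secret_key = "dev-only"  # needed for session support in this demo

ACCOUNTS = {20: {"email": "alice@example.com"}, 21: {"email": "bob@example.com"}}

# Vulnerable: whoever sends the request can read any account just by changing
# the number in the URL -- /vuln/accounts/20 vs. /vuln/accounts/21.
@app.route("/vuln/accounts/<int:account_id>")
def vulnerable_account(account_id: int):
    return jsonify(ACCOUNTS.get(account_id, {}))

# Fixed: the ID in the URL must match the ID tied to the authenticated
# session -- an authorization check on top of authentication.
@app.route("/accounts/<int:account_id>")
def safe_account(account_id: int):
    if session.get("user_id") != account_id:
        abort(403)
    return jsonify(ACCOUNTS.get(account_id, {}))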
[00:56:22] Speaker B: You're making me look bad, and that's not helping.
[00:56:24] Speaker A: But the reason it made them look bad is because they were misleading to begin with.
[00:56:28] Speaker B: Yeah.
[00:56:29] Speaker A: Just. Yeah. So you can see why.
[00:56:32] Speaker B: And then when they do, they. They paint you as the villain.
[00:56:35] Speaker A: Right. Why is it my fault?
[00:56:38] Speaker B: Super crazy, right? This is. That is fascist. Like, I hate to pull that out, but, like, that is not right.
[00:56:47] Speaker A: And there was. When the brown shirts.
[00:56:48] Speaker B: Gonna come out next and start shouting us down in the public square.
[00:56:52] Speaker A: Have you seen that?
[00:56:53] Speaker B: That is crazy talk.
[00:56:54] Speaker A: It's. It's. I think it's Hannibal, the comedian.
[00:56:57] Speaker B: Yeah.
[00:56:57] Speaker A: He's like, why are you booing me? I'm right. Right? Like, he's just telling the truth. He's just. Yeah, this guy, he was on. I think it was maybe on the Eric Andre Show or something, but: why are you booing me? I'm right. I'm telling this.
[00:57:11] Speaker B: I'll have to watch this.
[00:57:12] Speaker A: I'm trying to help you out, and you're telling me that I'm the bad guy? Yeah, it just. Or that I'm working with. Not even that. Oh, you must be an attacker. No, no, no. You're colluding with the gang that attacked us?
[00:57:20] Speaker B: Yeah. Huh. You've literally turned me into the villain, then.
[00:57:23] Speaker A: I'm the worst.
[00:57:24] Speaker B: And I'm, like, trying to help you stop this.
[00:57:27] Speaker A: If I'm colluding with that gang, they're gonna fire me.
[00:57:29] Speaker B: Like, oh, yeah.
[00:57:29] Speaker A: 'Cause I'm doing a bad job.
[00:57:31] Speaker B: So I gotta. Hold on, my HR meeting just pulled up with Conti or whatever ransomware gang did this.
[00:57:38] Speaker A: It's just. The whole thing to me is like, I'm mad on behalf of this guy.
[00:57:42] Speaker B: Yeah.
[00:57:43] Speaker A: If I lived in Ohio, I'd want to go sit in on this case.
[00:57:45] Speaker B: Coalfire, right?
[00:57:46] Speaker A: Yeah. Guys, that was when they were doing, like, a physical pen test, right? And they had been given permission or something, and then they got in trouble, even though it was, like, we're supposed to be here.
[00:57:55] Speaker B: There was a weird, like, jurisdictional thing where, if I'm remembering correctly. It's been a minute. Where they had authorization for the offices inside the building, but not the building itself.
[00:58:07] Speaker A: Yeah.
[00:58:08] Speaker B: So they had to break the law to gain access to the offices. And that wasn't really understood by the people that were authorizing them, or, A or B, Coalfire. And it wasn't until they got arrested and stood before a judge, who just said, I don't care what you were trying to do. They were like, judge, we're literally trying to just make this a safer place. She said, I don't care. You're going to jail. And it's like.
[00:58:36] Speaker A: I'm gonna go live in the woods. Actually, I can't do this anymore. It reminds me a little bit of how you mentioned earlier, it's not illegal for you to make a digital backup of your DVD, but it's illegal for you to break the encryption.
[00:58:46] Speaker B: Right.
[00:58:46] Speaker A: But you maybe, in theory, have to break the encryption to make the backup. So it's like, well, in order to.
[00:58:51] Speaker B: Forget the "in theory" about it.
[00:58:53] Speaker A: Well, I'm trying to not, like, in.
[00:58:54] Speaker B: Reality, you have to break the encryption.
[00:58:57] Speaker A: So it's like, in order to get to those rooms, I have to breach the building, but I can't breach the building because that's against the law. So it's like. Yeah, it just. Bro, I know I'm preaching to the choir, but if I'd ever had any interest in being a security researcher, stuff like this would put me off of it. Like, why would I take the risk? So even if this stuff is few and far between, the fact that it happens is scary. So we'll, uh. You know, I'm rooting for you. I don't want to speak for you, but I'm rooting for you, David. Hope you win your case. 'Cause that. I think that's BS.
[00:59:24] Speaker B: Yeah.
[00:59:24] Speaker A: Me personally. We got a couple more. I know we're. We're running out of time here.
[00:59:27] Speaker B: A couple more that we're going long-winded on.
[00:59:29] Speaker A: We're going long-winded.
[00:59:30] Speaker B: It's very. We get passionate about it.
[00:59:31] Speaker A: I told you I was mad. I'm an angry person today.
[00:59:34] Speaker B: Well, she flipped her desk and everything.
[00:59:36] Speaker A: I'm mad as a hatter. It's great. We got a couple more that have to do with another topic that's got me. I'll try to keep it short, but AI. It's great. It's everybody's favorite. Clearview AI is facing a 30.5 million. I believe that's pounds. I could be wrong. Fine. Or maybe it's euros. For building an illegal facial recognition database. Oh, right. Because you know what this is? This is a. It's a Dutch thing that.
[00:59:56] Speaker B: Oh, yes.
[00:59:57] Speaker A: So, so, okay.
[00:59:58] Speaker B: Yeah, this is almost a deja news.
[00:59:59] Speaker A: Because this was the same organization that we were talking about. They were. It was Uber, right?
[01:00:03] Speaker B: Yeah, Uber.
[01:00:04] Speaker A: That was in violation, they said. So it's the same organization. It's the Dutch Data Protection Authority. They're imposing this fine against facial recognition firm Clearview AI, in case you didn't know what they do, because they're saying they violated the GDPR by building an illegal database with billions of photos of faces. And the idea is that basically, if there's a picture of you that exists on the Internet, and probably there is if you're watching this, right. Realistically, unless you're an infant, probably there's a picture of you out there. Unfortunately. Uh, basically, it's public, so they can scrape that and use that in their database. So if you go visit, I guess if you go visit the EU and you're in a place where this is active, they can use this facial recognition technology and they'll know that it's you, because they have your face in their database, even if you've never been there before. So in theory, I understand how this works, but it is a little scary, and I can see why there's like a, whoa, whoa, whoa, you can't do that. But do. Do you think legally they have a leg to stand on to say, like, you can't do this, because it is public information?
[01:01:02] Speaker B: It would seem that they probably do, just out of the gate. I definitely get the hesitancy, the concern, behind the fact that, well, we've recently opened up a new Pandora's box and we were not prepared for how that could be utilized, because how do you prepare for something you don't know anything about? And now. So this is going to help kind of set those rules and boundaries on it.
While, yes, it might be public. But. And then there's GDPR that says, well, if it's an image of me, I own that, right? I get ownership over that in some capacity. And I can say, I want you to take it down.
[01:01:45] Speaker A: Right.
[01:01:46] Speaker B: I want you to remove it, because it is mine, because it is about me. So metadata and information that is about a person, that person, at the end of the day, owns in some capacity and has the rights over, according to GDPR, if I'm interpreting it correctly. Yeah. And that would be where they can say, if you're not giving everybody that you've scraped the ability to remove their data from your platform, then you're in violation of GDPR, right? I don't know if that's the argument they're making.
[01:02:15] Speaker A: It says they insufficiently inform people who are in its database about how their data is used. So. People? A lot of people. I mean, it's a database of more than 50 billion photos of people's faces. So odds are a lot of people don't even realize that they're part of this database. I mean, how do you inform everybody? They do offer, at least in the US. Clearview offers some residents the option, the ability, to access, delete, and opt out of profiling. But only in six US states.
[01:02:41] Speaker B: The government does not like competitors.
[01:02:43] Speaker A: Right? Yeah, but it's only in some of.
[01:02:46] Speaker B: This boils down to.
[01:02:47] Speaker A: And of course, the reasoning for this facial recognition system is, oh no, it's to be used for good. We're going to use it to identify suspects and persons of interest and victims. We want to help solve and prevent crimes. Oh, that's great. I'm really happy that you have a noble cause behind what you're doing. However, I don't want to be in a database that I don't even know that I'm in. And I know there's only so much I can do to avoid that. I have willingly put my. My face and stuff out there on the Internet. I understand there's a risk that comes with that. Yeah, but this is across the ocean. This is, like, Dutch. What? Like, I don't even. Until a few weeks ago, I'd never even heard of this agency.
[01:03:16] Speaker B: The Internet knows no boundaries.
[01:03:18] Speaker A: That's true.
[01:03:18] Speaker B: Right?
[01:03:18] Speaker A: That is true.
[01:03:19] Speaker B: I mean, technically, there are boundaries to the Internet, legally, but.
[01:03:24] Speaker A: But you're right.
[01:03:25] Speaker B: Worldwide, it doesn't really care.
[01:03:27] Speaker A: Yeah. Yeah, that's true. So to me, this was interesting, because it's like, well, it's public, and so.
[01:03:35] Speaker B: Technically, I wonder if they'll make the legal precedents of. Or, I say precedents. They'll make the legal case that you made this public. You posted it publicly to be consumed by the public. I am the public. I consumed it.
[01:03:51] Speaker A: I'd be curious to know if they.
[01:03:52] Speaker B: How are. Like, what's. What avenue are they gonna take?
[01:03:54] Speaker A: Because then, like, okay, so, like, if a brand uses my face.
[01:03:57] Speaker B: Yeah.
[01:03:57] Speaker A: And, like, makes money off of it, I have an avenue to be like, you can't make money off my likeness. Right. I'd be curious to know if there's any argument like that that could be made for. Like, quite possible. Well, are you making money off of this product you're creating?
That's a facial recognition software or whatever, and technically, you're using my face, and so give me a cut of that, or else you're not allowed to use my face. Like, I would be. I know that's getting really convoluted, but I'd be curious to know if that kind of argument could be made, if at all. Anytime I see something now that has, like, AI in the headline and stuff like this, it pisses me off.
[01:04:25] Speaker B: What'll be interesting is how this case ends up.
[01:04:29] Speaker A: Yeah.
[01:04:29] Speaker B: Right? So, because think of the scenarios. What are the possible scenarios? One scenario is that Clearview wins, right? And they get to continue scraping the Internet and build this massive database.
[01:04:40] Speaker A: They pay no fine. Yeah.
[01:04:41] Speaker B: And those databases get used by law enforcement and whomever else they license it out to, and it's another day at the beach. I'm not saying that's a good thing or a bad thing. I'm just saying that is a possibility. Basically, they continue as if nothing ever happened. Second possibility: they lose the case. What happens to that data?
[01:05:01] Speaker A: Yeah.
[01:05:02] Speaker B: Do they just. Do they go out there with giant magnets and degauss everything and.
[01:05:06] Speaker A: Yeah. And it's all lit on fire.
[01:05:08] Speaker B: Yeah. Burn it. Stick it in a big chipper.
[01:05:10] Speaker A: Get Billy to bring out some rags and kerosene.
[01:05:12] Speaker B: Yeah. And go buck wild? Or does that stuff get confiscated and make its way into a government system where they use it for facial recognition? Right. I feel like that's about how it would go down.
[01:05:26] Speaker A: Right? Like, I'm a conspiracy theorist, but it's not destroyed.
[01:05:29] Speaker B: Yeah.
[01:05:30] Speaker A: Yeah.
[01:05:30] Speaker B: And what are they gonna do, legally go after themselves? Oh, we're using this wrongly. We should stop.
[01:05:36] Speaker A: Right? Or is it going to be like, well, it'd be a shame to let this database of 50 billion photos go to waste.
[01:05:41] Speaker B: I mean, it's already here. What are you going to do, take.
[01:05:42] Speaker A: It off your hands?
[01:05:43] Speaker B: Yeah, we got to take care of it. You know what they say: the scariest words you could ever hear are, I'm from the government, and I'm here to help. Yeah. That is true.
[01:05:54] Speaker A: It's true every day. It's a little scary. So. Of course. Yeah. The wonderful world of AI. And I'm not. I'm not afraid at all.
[01:06:01] Speaker B: Yeah. Not a fan.
[01:06:02] Speaker A: One more thing concerning AI. We won't spend too much time on this. Uh, it's more of a. I'm curious. I'm curious to get your opinion on this.
[01:06:09] Speaker B: Yeah.
[01:06:09] Speaker A: Oprah. Forgot about her. Right. Oprah's upcoming AI television special sparks outrage among tech critics. And that's. That's a little bit of a, you know, a feelings-infused headline.
[01:06:17] Speaker B: There, but it's a bit of an op-ed.
[01:06:19] Speaker A: This is going to be. It's a one-hour special that's airing September 12, sometime next week. Okay. And it's gonna. Oprah's gonna host it, and there are going to be AI experts. But not. I don't know that I'd call them experts, but Bill Gates, Sam Altman. Experience in AI? Absolutely, for sure. I don't know that I think AI and immediately think Bill Gates. I know he's obviously, like, a figure in the tech industry, obviously. But specifically with AI, I don't know.
But this is a one-hour show that's going to explore AI's impact on daily life and will feature interviews with these people. The thing to me that's interesting about this: I don't know anybody under the age of 30 that willingly watches Oprah or engages with her content. No shade to Oprah. I don't personally like her, but no shade as far as her show. I know it's still very popular.
[01:07:03] Speaker B: It's not that Oprah doesn't know her demographics.
[01:07:05] Speaker A: Sure. That's what I'm saying. The people of a certain age are not Oprah's demographic. They're not looking to watch her show. Right. Generally, she's not relevant to people my age, usually. I would argue probably a good chunk of her audience is, like, baby boomers, Gen Xers. Right. That's probably. Millennials, maybe.
[01:07:20] Speaker B: Yeah.
[01:07:20] Speaker A: So that's probably her target audience. For people on the higher end of that spectrum, age-wise, it could be this is one of the first times they will hear something like this discussed in detail, and you attach a face like Oprah to it. Oh, I love Oprah. She's great. You get an AI. You get an AI. Whatever. You know, to me, it gives me an icky feeling, I guess. And I can't really put my finger on why.
[01:07:40] Speaker B: So I feel like. Can you tell me if I'm on base if I hint around what you're feeling? I feel like the problem here is, A, she is presenting a very curated side, a single side. And that seems to be the concern of the other, or I say the other, AI experts.
[01:08:01] Speaker A: Right.
[01:08:01] Speaker B: Right. Is that you're only presenting a very, very polished and specific side of the use of AI and what you can do with it and can't do with it, how it's being used, and so on and so forth. And it's being done through someone who has celebrity and influence. And the people that she's having on to give you that information are people that have celebrity influence. They're politicians or. This seems to be the game we've decided to play, because it's effective to influence people. Right. We live in the time of influencers. It's all about who you're listening to as far as an influencer goes. And if you can get a large voice to have other large voices come on and present something as true, as good, as right, then it's generally accepted by the audience that listens to it that it is true, good, and right. We don't really offer a lot of opposition to that, because that's not what we want.
[01:09:06] Speaker A: Yeah.
[01:09:07] Speaker B: And correct me if I'm. That's what you're kind of getting the.
[01:09:09] Speaker A: Feeling of, I think. Yeah, I think that's mostly it. And this is not the first and only time that this has happened.
[01:09:15] Speaker B: Won't be the last time.
[01:09:16] Speaker A: Some of those will happen. We see constantly there's either social issues or whatever things that are pushed by celebrities. And that's always gonna give me an icky feeling. Like, okay, like, what? You're a pop singer. You're great. Love ya. Love your music. But why am I gonna listen to you about such and such issue? Right. So I think that's kind of what I'm feeling here.
[01:09:31] Speaker B: I wanna listen to the foremost experts on certain issues. On certain issues. So show me credentials.
[01:09:36] Speaker A: And even in this case, the experts that they're putting up there, Bill Gates and Sam Altman by name. Sam Altman's the CEO, I think. Maybe. I don't know. He's left and joined and left and joined, so who knows where he's at?
So, point being, he has a vested interest in presenting generative AI in a mostly positive light.
[01:09:52] Speaker B: And I think that the people that are detractors are not saying we shouldn't hear from Sam Altman or Bill Gates or Oprah or whatever. It's that you are not showing a well-rounded spectrum of arguments for this. And because of that, people are only going to get a single viewpoint. And that's going to be the viewpoint that they latch their hooks into when they start talking about AI. And that's the worry: if everybody doesn't come to the table, as small or large as their viewpoint may be as far as consensus goes, what they want to do is have that conversation and hear everybody's position, so that people can make informed decisions on whether AI is good or bad or neutral, or whatever. And that is the danger. I think that's what they're concerned about. And I would share that concern.
[01:10:39] Speaker A: And one of the taglines for this is that this special is going to feature some of the most important and powerful people in AI. Which, interestingly, just because you're important and powerful doesn't necessarily mean that you're an expert. So I'm not saying that Sam Altman.
[01:10:50] Speaker B: Doesn't have some expertise, or no weight to their opinions.
[01:10:55] Speaker A: You can have a lot of power and not know what the hell you're talking about. I'm not saying Sam Altman seems that way.
[01:10:58] Speaker B: But. Right.
[01:10:59] Speaker A: I'm saying, like. I won't name names, but I'm just saying, like, that's not an uncommon thing. And it's. It's just. It's kind of sad, too, because I know there are a lot of misconceptions about artificial intelligence and certain ways that it can be used, and it can be used for good, like anything else. But this kind of a thing, you have the opportunity then to introduce it in maybe more of a nonpartisan way and be like, hey, if you're watching this show, you're probably between this age and this age. Maybe you're not super familiar with this. Here's the good, the bad, and the ugly. You decide what you think about this. As opposed to: AI is good. AI is here to help you. AI is your new savior. Not that that's what they're going to.
[01:11:30] Speaker B: Say, but it doesn't matter what they say. Whatever they say, it doesn't have any, uh, kickback. It's. It's. It's meeting no opposition.
[01:11:38] Speaker A: Right. Exactly.
[01:11:39] Speaker B: That is the problem that the detractors of this, of this thing that Oprah is doing, are saying. Yeah, I'm.
[01:11:47] Speaker A: Slowly going from, hey, you know, I think AI could do some good, I think it's got some positive uses, to, well, maybe there are some use cases where it can be used for good, but I think it's something that we need to be careful about, to, very quickly, like, burn it all, I'm going to live in the woods. Like, I'm just quickly heading in that direction.
[01:12:01] Speaker B: I am looking for some wooded property, I got to be honest.
[01:12:04] Speaker A: Yeah, exactly. So, yeah, Zillow it up real quick. It'll be on ABC if you want to watch it and judge for yourself. Next week, it'll be September 12, and it'll also be on Hulu the next day for streaming. So I kind of want to. I want to watch it to see exactly what it consists of and then come back.
[01:12:22] Speaker B: I'm not saying don't watch it. I'm saying, like, cool. Okay. They presented one side. Right now, you, as the viewer, go out and find another side and see if that holds water.
I do find myself, because this is such a common thing, that that's how I have to consume information.
[01:12:37] Speaker A: Yeah.
[01:12:37] Speaker B: Is to go, okay, these people say X. Now let's go find the people that say Y, or not X. Let me go find the not-X people and go, okay, what do they say? And see the arguments back and forth, and then make an informed decision.
[01:12:54] Speaker A: Yeah, absolutely.
[01:12:54] Speaker B: Yeah.
[01:12:55] Speaker A: So I think I may try to set aside some time and watch this, because maybe I'm totally wrong and maybe the special is super informative, and I'm like, they were a lot more balanced than I thought they'd be. Something tells me, though, that's not gonna be the case.
[01:13:04] Speaker B: Just doesn't seem to be how we do business nowadays.
[01:13:06] Speaker A: So it's all my opinion. This is all opinion stuff. So I'm not saying that this is fact. But anyway, that's all I had. I warned you I was gonna be. I was gonna be a little bit bitter today. So, a little upset. A little upset. Yeah. I need to go have a sandwich and calm down. I have a PB&J in the fridge waiting for me, and then I'll feel better after that. Next time we come into the studio will be tomorrow. So I guess, as of the day this is being filmed, it'll be the day you're watching this: Thursday, September 5, 2:00 p.m. Eastern Standard Time. We will be here with Jax Scott for our all things cybersecurity webinar. It's gonna be a lot of fun. Bring your questions, and we will all do our best to answer them right here in this studio, live. We'll do it live right here on the channel, so make sure you tune in. We'd love for you to leave a comment, like this episode if you enjoyed it, and subscribe so you never miss an episode in the future. Next time we see you on Technado, the new iPhone will be out, as well as the new Ace Attorney video game collection. I will be offering my opinions on both of those things. So, Daniel, any final thoughts?
[01:14:01] Speaker B: No. You know, use the old noodle. Think. Logic, reason. It's always a good thing.
[01:14:07] Speaker A: Noodle. Use your noodle. Great parting words. Thank you so much for joining us for this episode of Technado, and we will see you next week. Thanks for watching. If you enjoyed today's show, consider subscribing so you'll never miss a new episode.
