378: Oracle Wants its AI to Watch You...

Episode 378 September 19, 2024 01:15:46
Technado
Show Notes

Oracle AI is championing mass surveillance, Microsoft and Cisco layoffs are in the thousands, and millions of D-Link routers are impacted by a critical vulnerability. Technado: Doomsday Edition starts now.


Episode Transcript

[00:00:04] Speaker A: You're listening to Technado. Welcome and thanks for joining us for this episode of Technado. Reminder that Technado is sponsored by ACI Learning, the folks behind IT Pro. And you can use that discount code, Technado 30, for a discount on your. [00:00:18] Speaker B: Looking for an ACI logo somewhere. I think there's a cup over there. Cup up. [00:00:23] Speaker A: You can't really see. A lot of the stuff back here is just for, like, you know, ambiance. It's like, just filler. [00:00:28] Speaker B: Just. Yeah. [00:00:29] Speaker A: Can't really make out what a lot of it is. I mean, there's a lovely football, which is great. You know, we're very interested, are diverse here at Technado. So we've got quite the slate of things today. It kind of runs the gamut between incredible, hard to believe type stuff. [00:00:45] Speaker B: Crazy. [00:00:45] Speaker A: Yeah, it just. [00:00:47] Speaker B: Oh, that sucks. [00:00:48] Speaker A: Yeah. Yeah, that's a new segment. Oh, that sucks. [00:00:51] Speaker B: Yeah, we're supposed to be a new. Oh, that sucks. [00:00:52] Speaker A: We'd be doing that every week. There's always. There's always something going on. That's just. You just. [00:00:56] Speaker B: Christian is hard at work right now behind the scenes, creating all that sucks. Hopefully trying to slide it into one of our sneakers. [00:01:02] Speaker A: It's just a sad emoji. [00:01:03] Speaker B: That's it. [00:01:04] Speaker A: So, yeah, it is gonna be an interesting one, but definitely a lot of good stuff to get into and a lot of interesting things to get into. So I'm ready when you are off. All right. We got our energy drinks. We're ready to go. [00:01:15] Speaker B: The ambrosia is frothy and good. [00:01:17] Speaker A: What flavors, then? [00:01:18] Speaker B: This is grape is ultraviolet, they call it. But that's really, really, really tasty. I find it to be the nectar of the gods. Honestly. [00:01:26] Speaker A: Interesting.
I can never get into grape flavored stuff. I like grapes. I just don't. Everything else. Grape flavor tastes like medicine to me. [00:01:31] Speaker B: You didn't grow up poor. That's probably what's up. [00:01:34] Speaker A: Sure. Okay. It just tastes like cough syrup to me, so. Well, it does. It's anything grape flavor. [00:01:40] Speaker B: My grandma used to just drink cough syrup. I don't understand. [00:01:44] Speaker A: I just don't like it. Maybe it's just a negative connotation. I don't know. It's the same with certain bubble gum flavor, things that taste like chewable Tylenol, and I just don't. I can't get into it, you know? I don't know. Whatever. Maybe. Yeah. [00:01:54] Speaker B: Okay. [00:01:54] Speaker A: Maybe I'm weird. Maybe I'm off. [00:01:56] Speaker B: Well, no one will dispute that fact. [00:01:58] Speaker A: We do like to start our Technado with stuff that is just hot off the presses. And so we have one in store for you today. This is breaking news. Breaking news. Thank you. [00:02:11] Speaker B: You guys can't hear that? This is so funny. [00:02:13] Speaker A: He sounds so thrilled to be here. So excited. So this first one, I'm sure you've already heard about it. It has been pretty much every, every single website, tech news and otherwise. Hundreds of pagers exploded in Lebanon and Syria in a deadly attack. Here's what we know. This is from SecurityWeek. I think they may have pulled some stuff from the Associated Press, but really, though. [00:02:32] Speaker B: That every news outlet is running some. Something about this story. Obviously, we are the Technado, we're not the New York Times, the Washington Post, or anything like that. So the political side of things is not where we're going with this. It's the cyber side of that. [00:02:48] Speaker A: Yeah. [00:02:49] Speaker B: Caught our attention. [00:02:51] Speaker A: Cyber, or lack thereof, because, I mean, they're pagers. [00:02:55] Speaker B: Right.
Well, the ingenuity behind, like, pulling this off was very sophisticated because it. [00:03:01] Speaker A: Wasn't like it was a hack or something. I mean, this was a. You talked about earlier. [00:03:06] Speaker B: I mean, like a supply chain thing probably is. There is some. Some technical aspects to this. So, ultimately, let's start with what, what we know. [00:03:15] Speaker A: Context. Yes. [00:03:16] Speaker B: Okay. So there. Hezbollah went to using pagers, I think, in February. They. They said, hey, don't use cell phones because cell phones can be tracked. And then Israel could. Could use that to track us, see where we're at. And, you know, they're. They're at war. So that's something they want to avoid. [00:03:33] Speaker A: Right. [00:03:35] Speaker B: Lo fi stuff has typically been very effective as far as, like, making communications obscured and kept out of the hands of your enemies because it's difficult to get it. It's super low tech. And that's why it has become, like, a very useful strategy. Or has it? Right. That's where we are here today is, okay, they moved to pagers, which is even more lo fi than probably, like, older cell phones, maybe, but they might have been using and probably were smartphones. So we don't want to get tracked. We'll move to those pagers. And now you get a page, you drop to a pay phone or you use another, like, a burner phone or something and so on and so forth. I don't know exactly what their chain of events were. It doesn't go into that. Israel said, okay, I see what you're doing. I got you. And they intercepted their shipment of pagers from Taiwan and installed. They probably had to do something to the circuit board. To let it know, like, on when this page comes in. Right. That. And that's where the technology kind of comes in. But then they had to, like, they installed explosives, a small amount of explosive, into the device itself.
Whether that was, I don't know if they turned the battery into an explosive or the explosive charge was separate from the battery, but the battery is obviously the spark that fires off the explosive device. There's a lot of. A lot of planning went into this. We got a supply chain attack. Right? How did they know that this specific order of pagers were for Hezbollah operatives? [00:05:18] Speaker A: Well, yeah. I mean, you've got to have an inside man somewhere, right? [00:05:21] Speaker B: Was it through some other means of technology? Again, it goes back to the sophistication of warfare that we have come into. We see Russia and Ukraine, and they're using drones. Right. We've seen a lot of them also warfare of them going after each other's utility systems and things of that nature. And now it's like, even if you use lo fi stuff, you're not safe. Right? Like, that is what is important for us to talk about when it comes to how cyber units are being utilized in warfare today. That's what's kind of scary is that even if you kind of drive, you're basically, we're going to start fighting each other with sharp sticks and, you know, out in the woods, because that's the only way to keep these. These things from being off limits when it comes to war, where. And basically use your communications as a weapon and not just like a normal cyber weapon, which is what we've typically seen. Hey, you don't have power. Therefore, you know, it's hard to have communications when nothing's powered up. [00:06:22] Speaker A: Yeah. [00:06:22] Speaker B: You know, you can't power up the satellites, you can't power up the grid. That needs to run the communications channels. Okay. Yeah, that makes sense. You've disrupted communications, but to turn their communications into an actual physical, kinetic weapon. [00:06:34] Speaker A: Yeah. [00:06:34] Speaker B: That's crazy. [00:06:36] Speaker A: Yeah. It obviously, like you've said, required a lot of thoughts very clearly, very well thought out. 
And somebody, most of the people that contributed to this article that spoke on it spoke on the condition of anonymity, given the circumstances surrounding all this. But there was an officer, I guess that basically was like, well, look, a pager already has a lot of the components that you would need, right? He lists it off. You need a container, a battery, a triggering device, a detonator and an explosive charge. All you'd really need to add would be the detonator and the charge and you've got yourself a homemade bomb. [00:07:03] Speaker B: That's right. [00:07:03] Speaker A: Which is. It's insane that, like, who thought of this, right? Who had the spark that was like, hey, what if we try this? [00:07:09] Speaker B: And it's a camouflaged hand grenade. Yeah. That you trust. [00:07:12] Speaker A: That you willingly take with you everywhere. [00:07:14] Speaker B: Yeah. [00:07:15] Speaker A: That you have to take with you everywhere in that situation. So it will be interesting to see. Cause now, at this point, do you trust any device? And, you know how we talk a lot of times about how, like, you can have, you know, the latest security and latest updates and whatever and patch everything the best that you can, but, you know, a lot of times, you're gonna have the other side of things, where people are always one step ahead, and so you constantly have to be. And it feels like this is no different. They're thinking that, oh, by going to pagers, by not using smartphones, we're avoiding some of these issues, and a whole other issue crops up that you didn't even well. [00:07:42] Speaker B: And it just goes to show you the fact that this is. A lot of. This is always going to be some sort of cat and mouse game. [00:07:48] Speaker A: Yeah. [00:07:48] Speaker B: Right. That. Okay, here's the problem that we have. We need to solve that problem. Okay, well, we'll do this, because that circumvents all those things that they can do. And then the other side goes, okay, I see your move.
Let me try this. And then we go back and forth. So it's just a very interesting thing. I mean, Unit 8200 was probably sitting around. Obviously, this wasn't an overnight, like, oh, I just happened to have this pager idea, and sitting in my back pocket, they went, okay. They. And they. They went to pagers. Let's adjust. [00:08:22] Speaker A: Yeah. [00:08:22] Speaker B: And then I'm almost. I guarantee this is very demoralizing, right. To know, like, we did something we thought were. Was pro. I say we. You know, if. If you're a part of Hezbollah, Hezbollah is probably sitting around going, we really dealt them a blow. They're not able to track us, and then they turn those into hand grenades. That's. That's gonna. That's gonna definitely psychologically. [00:08:42] Speaker A: Right. I'd be second guessing everything, right? [00:08:44] Speaker B: Have you. Have you seen the footage of, like, not. Not of this, but, like, the. The drone bombs that they drop them on tanks and stuff? Like, they'll. They'll have a grenade on a drone and drop it on a tank, because, you know, Russia has lots of tanks and stuff. This is getting really crazy that we are using technology in this way, and that's really what I want to focus on is the technology and how simple, mundane items that we thought were just for fun or convenience have now become a weapon. [00:09:13] Speaker A: And it was something that. Yeah, it's not like, oh, it was just a bomb disguised as a pager. It was a functioning pager that worked all day for three to six months, worked perfectly. No reason to suspect anything. And it was an error message sent to all the devices, it looks like, that is what triggered it. That's what it appeared to be. Um, so, yeah, I mean, I. A lot of pagers, it said, didn't go off, so that's how they were able to inspect and kind of figure out what's going on in here. Um, but, yeah, like you said, large scale, long time operation, sophisticated attack. Uh, yeah. I don't know. This is just.
I was like, wow. It left me speechless. I'm like, that's crazy to me. [00:09:48] Speaker B: It's just so insane that when cyber attacks become kinetic attacks, when you. When you bridge the gap. Yeah, that scares the crap out of me. [00:09:58] Speaker A: Yeah. [00:09:58] Speaker B: That's just an individual. [00:09:59] Speaker A: Yeah. Just because anywhere, regardless of where you're at in the world and regardless of. [00:10:03] Speaker B: Get away from me, devil. [00:10:05] Speaker A: Yeah. You can't. I mean, unless you just. [00:10:08] Speaker B: Right. [00:10:08] Speaker A: You become technology. Yeah. What are you supposed to do? [00:10:11] Speaker B: You're gonna be in the woods in Montana. [00:10:13] Speaker A: Yeah. Yeah. I'm gonna go live in the woods. I'm gonna go build a cabin. And that's it. [00:10:20] Speaker B: That's it. I'm checking. [00:10:21] Speaker A: Will not hear from me. [00:10:22] Speaker B: Those prepper people had it right the whole time. [00:10:26] Speaker A: Yeah. So anyway, I wanted to knock that one out first because that article did. That was here within the last several hours that that specific article came out. And so that was our breaking news for today. [00:10:37] Speaker B: Crazy. [00:10:37] Speaker A: But, of course, we do have plenty of other stuff on the docket for today. This next one comes to us from BleepingComputer. TfL requires in-person password resets for 30,000 employees after hack. And TfL is transportation for London or something like that. It's. It's a London transport. [00:10:51] Speaker B: Transport for London. [00:10:52] Speaker A: Transport for London. Yeah. So all their staff. So that's the 30,000 people their employees have to. [00:10:56] Speaker B: A lot of staff. [00:10:57] Speaker A: It's a lot of staff. Yeah. I mean, I guess London's a big city, right? [00:11:00] Speaker B: I wonder how many people work for, like, Meta. Yeah, I'm gonna look that up. [00:11:04] Speaker A: You do that.
We'll wait to hear. But these employees are now gonna have to attend in-person appointments to verify their identities and reset passwords following a cybersecurity incident disclosed almost two weeks ago. So the. The reason that this kind of stuck out to me is because it's like, oh, there's a cybersecurity incident, and, you know, you got to reset your passwords, you got to redo your 2FA and all this stuff, or change your email, but this is like, it's not good enough. You got to come in so we can physically see that you are you, and then we'll take care of resetting the passwords and stuff. Kind of like the DMV. [00:11:34] Speaker B: From preliminary facts, I'm seeing, like, Meta, as of December of 2023, Meta Platforms had 67,317 full time employees, which apparently had shrunk from 2022 from. [00:11:50] Speaker A: But Meta, that was the same, I think. Are they just in the US? [00:11:53] Speaker B: Are they global employees? I'm sure they're a global company. [00:11:56] Speaker A: So, I mean, you figure that's headquartered. [00:11:57] Speaker B: In the United States, but it's a little, they have branches everywhere. [00:12:00] Speaker A: It's a little more than double the amount we're talking about here. But this is just the city of London. [00:12:04] Speaker B: Just for City of London, you got 30 grand employees. 30K. [00:12:08] Speaker A: That's impressive. [00:12:08] Speaker B: That's a bit man. [00:12:09] Speaker A: And of course, in this situation, that means it is going to take quite a long time, probably an understatement, for this to completely be resolved. If every single employee's gonna have to go and do an in-person appointment to, you know, to verify their identity and all this stuff. [00:12:21] Speaker B: Did they say why they had to do it in person? [00:12:24] Speaker A: So I don't. I know it was for identity verification. So I guess maybe just because you can't is not good enough to be like, just submit a picture of your license or whatever.
[00:12:35] Speaker B: They're so afraid that this isn't really, like, solved. [00:12:39] Speaker A: Yeah, yeah. [00:12:41] Speaker B: That they don't trust that we want to verify that, yes, you are the person, you changed your password, and you're done. Cool. Next, because that line sitting out the. [00:12:52] Speaker A: Door, the attack that happened a couple of weeks ago, that this is not following. It was employee directory data, email addresses, job titles, employee numbers. So not sensitive stuff like dates of birth, but all that stuff that could be sensitive was accessed. So I guess maybe now it's like, okay, well, somebody could be pretending to be you or whatever with your name and job title and employee number. And so we got to have you come in so you can reset this stuff. We got to make sure that you are you. Um, because this is the transport for London. It's not like, I mean, this would be obviously catastrophic for any company but trans. I wonder how this will impact just people looking to travel and get around in London. If it's going to be like, this is going to slow our services a. [00:13:28] Speaker B: Little bit because you can't. I mean, how. I wonder how often. Maybe it's not as bad as what we think, because in my mind, I'm envisioning, you know, just lines of employees and somebody going next. Come on in. Okay, change your password. Do the thing. I don't know why. They've got a New York accent there in London. Okay, change your password. Right. Okay, that's a good password. Let's move on. But probably in reality, what's happening is that most employees end up going to work at some point in time, and while they're there, their boss is like, hey, don't forget to change your password. [00:14:07] Speaker A: Yeah. [00:14:08] Speaker B: And that's how it goes. [00:14:09] Speaker A: Yeah, yeah, that's probably true. So hope that they can get through that sooner rather than later. That really sucks. We feel for you. 
And this kind of segues us into our next thing as well, a segment that we haven't had in a hot minute. But, yeah, we're gonna do a little Behind Bars today. [00:14:32] Speaker B: Break the law and you'll go to jail. [00:14:35] Speaker A: He's busting a move. [00:14:37] Speaker B: You know, he says he'll go to. [00:14:38] Speaker A: Jail, but probably, probably you'll go to jail. [00:14:43] Speaker B: Hopefully you'll go to jail. [00:14:46] Speaker A: So somebody was arrested in connection with this cyber attack. 17 year old kid, affected the Transport for London so severely. I don't know what I was doing when I was 17. Nothing like this. [00:14:56] Speaker B: Yeah. [00:14:57] Speaker A: I don't even know whether to be, like, mad or impressed. [00:14:59] Speaker B: I'm trying to think here. When I was 17, so I'm a high school dropout, so I was out of school. I was probably skateboarding a lot. [00:15:07] Speaker A: Yeah. [00:15:07] Speaker B: And working at a grocery store or something. [00:15:09] Speaker A: But you weren't in jail. [00:15:10] Speaker B: I was not in jail. I mean, I'm not saying I didn't do things that could have probably landed me there, but I wasn't there. [00:15:16] Speaker A: Break the law, you might end up in jail. [00:15:18] Speaker B: Yeah. [00:15:19] Speaker A: So British authorities on Thursday announced the arrest of this 17 year old male. Of course, because he's a minor. I don't think they're gonna disclose his name or anything like that in connection with a cyber attack affecting Transport for London, detained on suspicion of Computer Misuse Act offenses in relation to the attack. So he's from Walsall. Walzel. I don't know how you'd pronounce that if you were a British person. Let us know. Is said to have been arrested on the fifth. Following an investigation, he was questioned and let go on bail. But it is. I mean, they do think that this. I mean, this is the guy. So it's. He's been released on bail. It will remain to be seen.
I guess what kind of time he actually serves for this. You know, a lot of times it's like this stuff comes up two, three times, like, oh, remember that thing we talked about a month ago? Yeah. He's now in court. Remember that thing we talked about? Yeah. He's gonna serve ten years or whatever. [00:16:04] Speaker B: Or Bob's your uncle. He's in prison. [00:16:07] Speaker A: You gotta stay in a British accent. Bob's your uncle and he's your aunt. Yeah, yeah, yeah. So pretty. [00:16:13] Speaker B: Your mother's brother. [00:16:15] Speaker A: I don't think I've heard that one. [00:16:16] Speaker B: Yeah. Robert's your mother's brother. That's a fancy way of saying Bob's your uncle. [00:16:19] Speaker A: Oh, yeah. Oh, learn something new every day. But the attack was launched on September 1, and it was a couple days later that they got him. Pretty quick moving. [00:16:30] Speaker B: Pretty quick response indeed. [00:16:31] Speaker A: Props. Good for you. [00:16:32] Speaker B: Yeah, good for them. I mean, I doubt there's many, if any, organization out there that is immune to a cybersecurity breach of any kind. There's some way in, you just. They just haven't done it yet. [00:16:45] Speaker A: Yeah. [00:16:45] Speaker B: Right. So. But a quick response is always a telltale sign of a mature security policy. [00:16:51] Speaker A: So, yeah, I will be curious to see what kind of repercussions come from this or what. Cause, you know, depending on the crime, it's like, oh, okay, you know, serve your 63 months or whatever random number. [00:17:02] Speaker B: But I wonder what the default amount of time for. [00:17:06] Speaker A: Yeah, well, I guess maybe the. It depends on the impact of the attack. [00:17:10] Speaker B: Probably has something to do with it. [00:17:11] Speaker A: Because it does say, there's not been a lot of impact on customers so far. [00:17:14] Speaker B: But we have maximum sentences as well. [00:17:16] Speaker A: Right? [00:17:16] Speaker B: It doesn't like.
[00:17:17] Speaker A: But the breach did lead to unauthorized access of bank account numbers. And that to me is like, that's not just like, oh, it was a list of emails. Not that that's not good, but not that that's bad or not bad. [00:17:28] Speaker B: Is the bank now changing 33,000? [00:17:30] Speaker A: Whatever it is, all you customers have to come into the bank. [00:17:34] Speaker B: You have a new bank account number. [00:17:35] Speaker A: That would suck to have to go into the bank to do that because it's like, oh, we're only open 10:00 a.m. to 04:00 p.m. and then sometimes we're not open at lunch and then, you know, on the weekends we're not open. And sometimes on Fridays we close early. So it would just. It would suck to have to deal with that. So hopefully that's not the case. [00:17:49] Speaker B: That's because we tie so many things to our bank accounts now. Yeah, right. And now I'm. Now I'm like, well, you know, I could probably pull up some. Probably derive or invent some system like DNS, but for bank accounts, right. Your name and your bank account. But then it's like, I feel like that's probably gonna open a security issue. Yeah. If that gets exposed, it's like, yeah, yeah. You see that convenience is the antithesis of security. [00:18:16] Speaker A: Interesting note here. The police previously arrested a 17 year old boy, also from Walsall, in July 2024 in connection with that MGM attack. [00:18:26] Speaker B: Oh, really? [00:18:27] Speaker A: It's unclear if it's the same kid because it's this. I mean, that kid is not 17 anymore, I guess. [00:18:33] Speaker B: No, he would not be 17. [00:18:34] Speaker A: He'd be an adult now. But, I mean, that would be not funny. It's not funny. But how odd would it be if it's like, this dude's been busy, this kid has been. [00:18:42] Speaker B: Oh, no, he could still be. It says. It's. It's worth noting that West Midlands police previously arrested a 17 year old boy from Walsall in July 2024. 
So it's been a full time, easily still be. [00:18:55] Speaker A: If he was 17 in July 2024, he'd be 18. And. Oh, I'm so stupid. [00:18:59] Speaker B: Yes. [00:18:59] Speaker A: I'm thinking that's last year. I'm like, no, time's not real, y'all. This is. I just cracked this open. [00:19:06] Speaker B: Glitch in the matrix right here. [00:19:08] Speaker A: You saw my eyes, like, cross for a second and I. My brain reset. [00:19:11] Speaker B: Yeah, that was funny. I'm like, what the hell are you talking about? [00:19:14] Speaker A: Are you good? Having a stroke right now. [00:19:16] Speaker B: Do you know when his birthday is? He happened to have crossed it good. [00:19:20] Speaker A: I'm thinking that's like, last year. No, that was just in July. That's crazy. Feels like it was longer ago than that. [00:19:25] Speaker B: You gotta get the diesel fuel to. [00:19:26] Speaker A: Kick in before you need to, like, I gotta go hard, reset my brain. But yes, you're right. It very. Could very easily could be the same guy. I feel like you'd have to be pretty bold to have just been arrested for this thing and then to immediately be like, yeah, but I'm kind of bored. Yeah. So it's just, you know, just carry out a little attack. This is more local. [00:19:42] Speaker B: He didn't learn his lesson. [00:19:43] Speaker A: He did not. Apparently, a little more local. [00:19:45] Speaker B: What happens when we don't. Yeah, there's no discipline, Swift. [00:19:49] Speaker A: That's right. So that was a little bit of a double whammy there with those employees having to go in and reset that stuff. And then we had a little Behind Bars as well. [00:19:57] Speaker B: It was fun. [00:19:58] Speaker A: It was fun, wasn't it? Yeah. Thank you for hard resetting. [00:20:01] Speaker B: My brain was watching Sophia. [00:20:03] Speaker A: I just know that's gonna be a YouTube short in like a month. That's gonna pop up. [00:20:06] Speaker B: She hit a loop in the program. That was not right.
[00:20:10] Speaker A: There's gonna be a clip of me with, like, the dial up modem sound. And I'm just gonna be like this. [00:20:17] Speaker B: Well, good times. [00:20:19] Speaker A: I'll probably let Daniel take the lead on this next one so I have a chance to reset myself. Some critical vulnerabilities are impacting millions of D-Link routers. Patch now. Exclamation point. So this is not the first time in the last. Even in the last month that we've talked about D-Link vulnerabilities. But I think the last time we talked about it, it was routers that had already been end of life. Like they were nothing being serviced anymore or being provided updates. In this case, though, these are routers currently. That are currently supported, right? [00:20:44] Speaker B: Yep. And that's what's up. This. When they say millions, they could mean you, right? [00:20:49] Speaker A: Talk to your doctor. [00:20:49] Speaker B: D-Link is a very popular manufacturer of SOHO routers and stuff. So you very well may have one of these things sitting on your desk. You know me, I like to go, hey, if. If this affects you, you should probably do something about it because you don't want to be affected by these problems. And this does happen from time to time. Good news is there's a patch. And what's really interesting. So. Well, hey, let's talk about. Let's talk about the vulnerabilities. Let's jump into that. Right. So we have a stack based buffer overflow for the first vulnerability. It affects the DIR-X5460 A1s and the DIR-X4860 A1s. So if you got one of those, you might have this. We got a CVSS score of 9.8. Right. Because we have the impact. Unauthenticated remote attackers can exploit this vulnerability to execute arbitrary code on affected devices. We call that a dumpster fire. [00:21:51] Speaker A: We talked about arbitrary code the other day. That's like personal wind code, right? [00:21:53] Speaker B: Yes, that's right. Whatever I feel I want to do.
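For reference, the CVSS base scores quoted in this segment map onto published qualitative severity bands (9.0 to 10.0 is Critical, 7.0 to 8.9 is High, and so on). A minimal Python sketch of that mapping, using the thresholds from the CVSS v3.1 rating scale:

```python
def cvss_severity(score: float) -> str:
    """Map a CVSS v3.x base score to its qualitative severity rating."""
    if not 0.0 <= score <= 10.0:
        raise ValueError("CVSS base scores run from 0.0 to 10.0")
    if score == 0.0:
        return "None"
    if score <= 3.9:
        return "Low"
    if score <= 6.9:
        return "Medium"
    if score <= 8.9:
        return "High"
    return "Critical"

# The two scores discussed here:
print(cvss_severity(9.8))  # Critical
print(cvss_severity(8.8))  # High
```

An unauthenticated, remote attack path with full code execution impact is exactly the kind of vector that pushes a base score toward the top of the Critical band.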
[00:21:57] Speaker A: It'll do what I say it shall be. [00:21:59] Speaker B: And the fact that it's unauthenticated in a remote. This is why it's 9.8 on the, on the Richter scale. So definitely update. You have that solution right here. D-Link has released firmware updates for those routers that will take care of this vulnerability. We also have another one, OS command injection. This has a score of 8.8. So obviously this is a little bit different as far as its probably its complexity or how, you know, maybe you have to have authentication or something like that already taken care of. But it does say attackers can use hard coded credentials. There we go. To log into. So if you still have those hard coded creds on your D-Link device, which is the 8460. Yeah, this is for the 8460 A1. They are going to log in and then they have the ability to define that specific sweet spot, allow them to do arbitrary OS commands. So very, very sucky for that. What else do we have here? We got another 9.8 for this one. Reveals hidden functionality in certain D-Link routers where the telnet service is enabled when the WAN port is plugged in. Well, that's fun. They probably went, oh yeah, if you're accessing this over the Internet. Telnet. No, no, this is where I roll up my newspaper at D-Link. I go, what are you saying? You look what you did. You see what you did? No, that's a bad D-Link. Bad. You don't do that. Uh, cause yeah, I mean, I'm sorry, right? I gotta throw my hands in the air. I gotta throw my hands in the air and say, listen, D-Link, I mean for like, I've been in the business for like 20 years now, over, been in the business over 20 years. My first tech job I got in 1999. Okay, wow. And right around that area was when they started going, yeah, telnet bad, SSH good, let's do SSH instead. Right? It wasn't long after I got into the field. So let's say for 15 years, SSH has been kind of been like, hey, we should prioritize using SSH.
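Since the issue described here is a router silently enabling its telnet service, one quick spot-check you can run against your own gear is probing TCP port 23. A minimal sketch, assuming you know your router's address (192.168.0.1 below is only a common default gateway, used as a placeholder):

```python
import socket

def is_telnet_open(host: str, port: int = 23, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to the given port succeeds,
    i.e. something is listening there."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, timed out, or host unreachable.
        return False
```

For example, `is_telnet_open("192.168.0.1")` against your gateway. A True result only proves something is listening on port 23, not that the device is vulnerable, but on a home router it is a strong hint to go check for a firmware update.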
What are you doing turning on Telnet? I'm not saying like, well, you know, SSH wasn't a, we couldn't do this, we had to give him something. Not telnet, not telnet. Okay. That's not what we do here. [00:24:20] Speaker A: Like I've been hearing that ever since I started working here when I didn't know anything. That was one of the first things that I heard come up was like, if you're using telnet. You are asking, you're asking. [00:24:27] Speaker B: You're basically asking for it. [00:24:29] Speaker A: Yeah. [00:24:29] Speaker B: Like crazy people. Okay. [00:24:32] Speaker A: Yeah, yeah. [00:24:33] Speaker B: But again, that, that would be bad. What else we got here? Another stack based buffer overflow with a 9.8 going after that. That 4860 is just a couch fire. It's every one of these. Is that 4860? What else? The vulnerability lets unauthenticated remote attackers execute arbitrary code. We got two of those on the same thing. We got another hidden functionality. So this is an 8.8. Not too bad. There he is. Attackers can enable telnet services by sending specific packets to the web service and then logging in using hard coded credentials. So again, here we are with my hands in the air. Going, hard coded creds has also, for a while now, been kind of a no no. Where's my paper? I do with my paper. Get over here. You come here. You make me come over there. [00:25:28] Speaker A: Cause it sounds so obvious and it, anybody that is doing things like that, hardcoding creds and stuff, you know, that, like on a basic level, they know that's not a good thing to do. And probably what they're thinking is just like, oh, well, it's just for a second. I'm just gonna, just for a minute. I'm just going to leave this here and I'll remember to come back later and da da da. So it's just, you just can't do that. [00:25:46] Speaker B: Nope. You just can't do that. [00:25:47] Speaker A: You can't take the shortcut. You got to be on top of that stuff.
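The hard-coded credentials anti-pattern the hosts are calling out is easy to illustrate. A minimal, purely illustrative Python sketch contrasting it with reading the secret from the environment (the variable name `ROUTER_ADMIN_PASSWORD` is hypothetical, not anything D-Link ships):

```python
import os

# Anti-pattern: a credential baked into the source ships identically on every
# device and can be read straight out of the firmware by anyone, forever.
HARDCODED_PASSWORD = "admin123"  # do not do this

def get_admin_password() -> str:
    """Read the admin secret from the environment instead of the source.

    Fails loudly when the secret is unset rather than silently falling
    back to a shared default.
    """
    password = os.environ.get("ROUTER_ADMIN_PASSWORD")
    if not password:
        raise RuntimeError(
            "ROUTER_ADMIN_PASSWORD is not set; refusing to fall back to a default"
        )
    return password
```

Real firmware would pull a per-device secret out of protected storage set at provisioning time rather than an environment variable; the point is simply that the secret never lives in the code, so it can be rotated and is never the same on every unit.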
[00:25:52] Speaker B: For us as the end user, obviously, we've got a firmware update to do. [00:25:56] Speaker A: Right. [00:25:56] Speaker B: And I think this is something that is highlighted. A lot of people probably forget about their access point, their wireless router sitting over there on the desk doing its thing, humming away, giving you that Internet access that you love so much. When's the last time you updated your firmware? Right, right. [00:26:15] Speaker A: You don't really think about it. [00:26:16] Speaker B: You don't really think about it. But it needs to become a part of, like, you know, quarterly: look and see, is there an update for my firmware? And if so, let me just apply that thing. Log in the web interface, do the little, hey, search for updates. Is one available? Update now. Simple, right? But if you don't do it and you run into D-Link or whatever and it's got known vulnerabilities, that's on you. I do believe that a lot of, like... I think my home router does it automatically. Like, I don't have to touch it. It just goes. It's using, like, um, webhooks or whatever to look for updates, and when they're available. Yeah, it just updates. [00:26:57] Speaker A: I think if you. If you can set things up to do that kind of thing automatically, in that case, I think it's the smarter option to do that. [00:27:03] Speaker B: You could probably write a script that does it. [00:27:05] Speaker A: Yeah, yeah, yeah. So if you have the know-how. [00:27:07] Speaker B: And then you just spend a little time. Spend an afternoon writing a script that grabs the right stuff. I mean, I guess it would be difficult. It's probably possible, though. Let me put it that way. It's possible. I don't know if it's probable for you and your specific devices, but it's like, sure. [00:27:22] Speaker A: And maybe the juice isn't worth the squeeze for some people to do that. But I do think, you know, in that case.
You somewhat prevent stuff like this from happening, or you avoid the consequences of stuff like this from happening, because it's like, oh, shoot, patch now. Well, mine updates automatically, so I'm good. If there's an update, I've got it. Um, so if you've got a D-Link router, just something to keep in mind, I guess, because there are patches. So there's a solution. Hope is not lost. Yeah, we do have to cover this next one before we go to a break, I think. We got one more. It's the Microsoft and Cisco stuff with the layoffs. [00:27:53] Speaker B: Oh, that's. That's tough. [00:27:55] Speaker A: Spoiler alert. Yeah, it's a bit of a double feature. [00:27:57] Speaker B: Yeah. Well, let's. We'll leave you with a. Yeah. Cliffhanger. Cliffhanger when we come back. [00:28:02] Speaker A: Up next on Technado. So don't go away. We're gonna go to a short break, and I will reset my brain. But when we come back, more on Technado. [00:28:11] Speaker B: There's a new CCNA in town, and. [00:28:13] Speaker A: Here at ACI Learning, we've got you. [00:28:15] Speaker B: Covered with a brand new CCNA version. [00:28:20] Speaker A: This course covers the theory that you need to succeed as well as the practical, hands-on application of technologies. [00:28:31] Speaker B: You're going to learn network fundamentals, network access technologies, IP connectivity, IP services. Don't waste any more time. Get signed up for the new CCNA. [00:28:45] Speaker A: Here at ACI Learning. Welcome back. Thanks so much for sticking with us through that break. As promised, we are going to talk about some layoffs, which I guess is not, you know, it's not the happiest thing in the world to talk about. But we promised we'd talk about it. So we'll. [00:29:04] Speaker B: Newsworthy, though. [00:29:05] Speaker A: It is newsworthy. And it's a little bit of a double feature here because, you know, Microsoft's not the only company that's been doing this, but we'll start with Microsoft.
Microsoft laid off another 650 staff, specifically from its video game workforce. Excuse me. And Xbox boss Phil Spencer of course sent a memo to the staff talking about it, because of course, you know, we're very sorry to have to do this, you know, we understand, we're gonna try to help you through this. But a lot of people were like, oh, this is probably related to that acquisition. Activision acquisition. [00:29:33] Speaker B: That's fun. [00:29:33] Speaker A: That sucks. [00:29:34] Speaker B: Yeah, Activision acquisition. Say that three times fast. [00:29:37] Speaker A: The merger, if you will, that happened last year. Uh, but that's not just a theory. He came out and said in the memo that, hey, following this acquisition, we've been trying to acclimate our teams, and hey, we're gonna be laying off some more people. Uh, so that means in total over 2,500 staff, uh, have been laid off since that acquisition happened last year, uh, 2023. So not fun, obviously, for a lot of the Xbox gaming staff. Um, but also just interesting that, I mean, 2,500 people. I wonder how many people in total they've got employed in their gaming division now. Because to lose 2,500 in a year. [00:30:11] Speaker B: And to make the statement that no games are being canceled, nothing in, in flight right now, and we're going to hit our deadlines and blah, blah, blah. [00:30:20] Speaker A: Like, no studios are shutting down. [00:30:21] Speaker B: You're saying you just had a glut of developers and, you know, support staff for no good reason. And yeah, they, they got to hit the fire because here's a fun fact, kids. If you work for a company and that company gets acquired or merges with another company, be prepared, because things are going to happen. I don't care what they say. Those are called, wait for it, lies.
It's just the nature of that function of business is that when they get together and become one new company, there's gonna be some people that suffer and lose their jobs because of it, because they'll start changing how things. And I've heard it, I've heard it a million times, right? If we get acquired, that's because we're awesome. They wouldn't change what we're doing. That's why they want us. Why would they change anything? Because that's how things go. [00:31:23] Speaker A: I wonder if like, you know, way, way back 100 years ago when there was a lot of fighting over land going on. And like, you know, we're gonna claim this land for Spain and whatever. I wonder if that's how people felt like, well, they wouldn't want to claim this land if it wasn't awesome. They didn't like what we were doing then it's like, no, sorry. This is how we're doing things now. [00:31:43] Speaker B: Yeah, normally what it is is there's something about what you're doing that they want. Not necessarily. They don't necessarily want you. They want the thing. They want access to the thing. So to your analogy, land, if I'm. I'm, you know, country a, and I see country b has really fertile land for development and resources. Maybe there's gold or whatever. Hey, we could use that. Let's go kick their butt and take it. [00:32:07] Speaker A: Yeah. [00:32:08] Speaker B: Right now we just do that with the power of money and just go, wow, they've got that cool thing. Let's just take it with money. Let's just offer them so much money, they won't say no. Yeah, and then they don't. And then you'll start to see where. Oh, the old CEO of that company, which now has merged with such and such, they're gone. They're. They took their payout and was like, this was fun, deuces. [00:32:29] Speaker A: Cause you see some. Maybe some positions stay, or, you know, the position's the same, but the title changes. 
Or you see, maybe it's like, okay, well, we don't really need you in this position anymore, but if you want to try, you know, we have this other position that we're creating, and if you want to apply for that, but if you don't, good luck and, you know, have fun in your next job. Right. [00:32:47] Speaker B: Just go watch Silicon Valley. [00:32:48] Speaker A: Yeah. [00:32:49] Speaker B: Yeah. [00:32:49] Speaker A: There you go. This is not new. [00:32:50] Speaker B: It's like the playbook. [00:32:51] Speaker A: This has been going on for forever, so. [00:32:53] Speaker B: And then you can see, oh, this is how this works. It is a. There's a reason that show was so funny is because it was lampooning all these stereotypes that were true about the tech industry. [00:33:04] Speaker A: It's funny cause it's relatable. [00:33:05] Speaker B: It's funny because it's relatable. [00:33:06] Speaker A: A lot of people. [00:33:07] Speaker B: Yeah, that's how that goes. [00:33:08] Speaker A: So I found something that. This is the Jordan Times, I think, which is very specific, but this is. I was trying to find how many people they employed specifically in the gaming division. So as of January, when they initially started talking about, hey, we're gonna be laying people off. They employed 22,000 people at the time in the gaming. [00:33:24] Speaker B: In the gaming division in the gaming. [00:33:25] Speaker A: Business, which included the Xbox division. [00:33:27] Speaker B: So looks like these cuts follow the already eye watering 1900 layoffs that Microsoft made to its gaming business earlier this year. So now we're, we're over the cusp of 4000 people without a job since earlier. [00:33:41] Speaker A: Yes. Yes. So. So I think in total the latest layoffs mean Microsoft has let go of 2550 staff. It's a little earlier in the article from its gaming business since acquiring Activision Blizzard, which is in 2023. So that 1900 is part of that bigger number, I think. 
[00:33:56] Speaker B: Yeah. [00:33:57] Speaker A: So that's. [00:33:59] Speaker B: Here's what I feel about this. Go ahead, I'm sorry. Finish your thought. [00:34:01] Speaker A: No, I was just gonna say that's, I think, the grand total that they, that they provided for now. [00:34:05] Speaker B: So what this tells me is there's a bunch of game developers that are out there without a job right now. I feel like, maybe I'm wrong, and obviously it's not fully baked, I'm just going off the dome. Off the dome. Right. Let it reflect and you cook on it. If I was a game developer, I think this is a great time to make my own games. Right. Self-published games, especially with, like, Steam and all these, like, Android gaming systems and stuff. It really is, the sky's the limit. You can go do your own thing. And the more developers that do that and go, you know what? I'm going to start a small business and I'm going to make my own games. And, you know, maybe we're a one- or two-person shop for a long time, but we make really cool, fun, quality games. That starts to, it makes it to where if I want to be a game developer, I don't have to go to Microsoft. I don't have to go to Blizzard. I don't have to go to Epic. I don't have to go to them to be a game developer that can make a living and impact the industry with good stuff. And the more of you guys that do that, when there's 4,000 of you, apparently, excuse me, 4,000 of you out there that are throwing resumes out at this point in time. Maybe some of them already rebounded and got jobs. Hope they have. Yeah, but think about it. Maybe. Maybe just start doing your own thing. I get it. That's easier said than done. Scary as hell. [00:35:32] Speaker A: Sure. [00:35:32] Speaker B: Right? To go, yeah, I've gotta, I've still got to pay my bills until that happens, until I'm making enough money off my own stuff.
But start working on that so that this doesn't happen to you again. You're ramping up, on your free time, this new business model for yourself, right? I think that would be cool. I really like independent games. [00:35:55] Speaker A: Yeah. [00:35:56] Speaker B: I really like... I mean, don't get me wrong, the cool big houses that are putting out all the. All the best stuff. Yeah. Yeah. I'm not gonna lie. Pretty cool. I don't really, I don't have a system or anything for me to take advantage of them, but I can recognize the fact that they are very, very cool games and that people really like them. But I really like to look and find those weird oddity games that are out there and go, oh, download this on my smartphone and play it. [00:36:23] Speaker A: Yeah, I agree. And I think you're right in that one of the few silver linings that comes from stuff like this is that I think you do start to see indie games, indie studios, maybe crop up after big layoffs like this. I mean, you look at games like, I literally just googled best indie games: Cuphead, Stardew Valley. Yeah, Stardew Valley. There was another one, Terraria. Like, you look at these games that are, like you say, Stardew Valley, most people know what you're talking about. Cuphead was huge and still is. Undertale, huge and still is. It was like a cultural reset for people at my high school. It was like, oh my God, this is a big deal. So like, people were, people were like cosplaying Sans in the halls. It was great. But anyway, so point being, just because it's an indie game doesn't mean that it can't have that same kind of impact that a bigger one does. I mean, yeah, I'm always down for a round of Call of Duty. Yes, I will always enjoy the Mario and Luigi games and have a fun time. But, but, but yeah, I think the. [00:37:11] Speaker B: Indie game... that doesn't mean we don't make new stuff either. Right, exactly. [00:37:14] Speaker A: And maybe a little more creative freedom.
[00:37:16] Speaker B: Right? And I'll tell you right now, tell me that's not appealing. If you are a game designer and you got nobody telling, telling you, no telling you, you can't do that. You can't make it that way. You have to deliver this by this date and time or you're sanctioned. You know, you're not hitting your KPI's. You know, you get to, you get to shirk off all that corporate culture and just go, I'm gonna make something that I think is fun and I think other people will think is fun. [00:37:41] Speaker A: Yeah. [00:37:42] Speaker B: Right. [00:37:43] Speaker A: I think the. It's the same with, like, YouTube videos. How you said, like when I make a YouTube video where I'm not trying to get views or do anything specific, those are the ones that tend to pop off. I feel like sometimes that kind of seems to be the trend with game when you're just making a game because it's like, I wanna make something fun that I'll enjoy and other people enjoy and you're not worried about. I gotta make sure that I'm gonna meet this deadline or I've gotta make sure I'm gonna make this amount of money. Those sometimes end up being the games that are like Superstar games. They're just like sleeper picks for games. Anyway, so we said this is gonna be a double feature. [00:38:12] Speaker B: You know, it started as kind of a down smile with the layoffs, but then I think we added a little uptick to it. [00:38:17] Speaker A: We need like some music in the background that swells a little bit. [00:38:20] Speaker B: It's the. What was that ESPN music for like, dun dun dun, dun, dun, dun, dun, dun, dun. You know what I'm talking about, where it's the motivational coach. [00:38:31] Speaker A: Yeah, yeah. That's what we need in the background. Have like a flag waving and everything. It's gonna be great. So not to bring the mood back down, but of course, Microsoft was not the only company that announced more layoffs this week. 
Cisco just went through their second layoff of 2024, affecting thousands of employees. And in this case, it was more than double what Microsoft has laid off so far: 5,600 employees, following an earlier layoff in February, which let go of about 4,000 employees. So nearing the 10,000 mark for Cisco in total just this year. Yeah. [00:39:02] Speaker B: Did they give a reason why? [00:39:04] Speaker A: I don't actually know if they did give a specific reason. They didn't give a reason for the month-long delay in notifying staff, which kind of sucks, because they announced. [00:39:13] Speaker B: I did hear that they told them that there will be layoffs in 30 days. Yeah. They're like, well, is that me? [00:39:21] Speaker A: Well, right. It's almost like, okay, thanks for the heads up, I guess, so I can maybe start looking for work. But at the same time, it's like, now I'm just gonna spend the next month, you know, shaking in my boots. Like, is that going to be me? Am I going to lose my job? [00:39:31] Speaker B: That's a sucky way to live right there. [00:39:33] Speaker A: Yeah. [00:39:33] Speaker B: It says that Cisco said in August, in an August statement, that its second layoff of the year would allow the company to, quote, invest in key growth opportunities and drive more efficiencies. On the same day Cisco published its most recent full earnings of the year, the company said it was the second strongest year on record, citing 54 billion in annual revenue. That. That seems contradictory. If you're telling me that we're not efficient enough, and that we have a glut and therefore it's gumming up the works, but you just showed your second-greatest year? That seems contradictory to me. Or is it just a greed grab of money, of going, if we did the second biggest with this many people, imagine if we laid off almost 10,000 employees and didn't pay salaries and employee benefits and, you know, taxes and all this stuff that goes along with having employees.
We could really bank well on this. [00:40:30] Speaker A: I mean, I know it's. You can tell by looking at me, I have never been a CEO. I've never been close to a CEO. Right. I know, the lines, the wrinkles that just populate my face. You can tell I've really been through it. No, I've never been in a position like that, to have to make a decision like this. Obviously, I haven't been alive long enough. But I know maybe there's limits to what you can say. Maybe you can't get super specific with reasoning. But every time there's, like, an announcement like this, and it's, like, we're going to invest in growth opportunities, we're driving efficiencies. [00:40:59] Speaker B: Huh? [00:41:00] Speaker A: Like, that's so vague. You could just pick a random... well, this is so that our business can boom and we can meet the bottom line. Like, you can. You could say anything. And okay, that's your reasoning? I guess when you're a big corporation like that, maybe you can't get specific. If I was gonna be laid off or fired or whatever, I would almost rather it be for cause. Like, I did something, or I'm not meeting a benchmark, or like, hey, we've given you six months, we told you this was the goal, you're failing to meet it. Sorry, but you're out the door. At least I have a reason. At least I know. Okay, fair enough. But the ones that I feel like are the worst are: sorry, it's not you. We just. It's you and 10,000 others, basically. And you're not alone in this. And you couldn't have done anything possibly different to prevent this. [00:41:39] Speaker B: Just a cog in the wheel. [00:41:40] Speaker A: Yeah. We can't give you a reason. [00:41:41] Speaker B: It's not personal. It's business. [00:41:43] Speaker A: We are investing in key growth opportunities, and you are not one of them. Get out the door. [00:41:47] Speaker B: What key growth opportunity? [00:41:48] Speaker A: Right.
[00:41:49] Speaker B: That's none of your business right now, because you're no longer employed here. It's like, oh, I see what you did. [00:41:53] Speaker A: So, like I said, I've never been in that kind of position. So, you know, I mean, just put that out there, I guess. I wouldn't know personally, but that's just how it comes off to me. [00:42:02] Speaker B: I wonder, like. So, Cisco. Do a Google search for me. What market share does Cisco have over their market? Because I know they're the number one, right? I know they're number one in their space, but I feel like they're, like, number one by a large martian margin. A large martian. [00:42:22] Speaker A: So the first. The first thing to come up was Cisco routers. Says Cisco has 35.72% market share. [00:42:32] Speaker B: Okay. So it's not as big as I thought for routers, but that's still... Juniper. What's Juniper's market share? [00:42:37] Speaker A: Gosh, I don't know. And that's it. [00:42:38] Speaker B: You keep talking. I'll look up Juniper. [00:42:40] Speaker A: Let's see. Global enterprise network infrastructure market share. I don't know. Maybe that'll be a little different. Says in 2022, Cisco made up 41% of the enterprise network infrastructure market, so nothing to sneeze at. [00:42:50] Speaker B: Juniper has 10% market share for routers. [00:42:53] Speaker A: Okay. [00:42:54] Speaker B: What the hell is the other 50%? [00:42:57] Speaker A: Maybe it's just, like. It's so, like, oh, Cox has 10%, and Cisco has, you know. Like, who are they using? [00:43:04] Speaker B: I mean. Yeah, that's interesting. [00:43:06] Speaker A: That is interesting. [00:43:08] Speaker B: Maybe. I don't know. I don't know. Who else is manufacturing? [00:43:12] Speaker A: In 2023, the network hardware market share worldwide: a company called HPE Aruba Networks led with 56.2%. But Cisco, it looks like, was close behind. It didn't give me a number, but basically they were in second. [00:43:24] Speaker B: Okay.
[00:43:25] Speaker A: So I think. But I think, you know. Oh, 30%. But 30% is still a lot. Or 35% or whatever. That's still a big number. [00:43:31] Speaker B: Again, that just goes to show you: last time I checked the market share for network devices, and I don't do that stuff anymore, Cisco was. Was the king of the hill of networking. Cisco. Cisco was king of the hill, like, by leaps and bounds. They were the Leviathan of network equipment. If you went to a shop, it was almost guaranteed that it was running Cisco routers and switches. So maybe things have changed a lot in the landscape because of data centers, like cloud data centers. [00:44:03] Speaker A: Yeah. [00:44:04] Speaker B: And HPE makes really fast, like, backbone stuff, and so cool. And they probably got a really good deal, that kind of thing. Yeah, it's all like that. [00:44:17] Speaker A: So it's hard to, I think, get a. Excuse me. It's hard to get, like, a solid total number, because it's divided up by, like, oh, specifically with routers, hardware in general, cloud reseller. So it kind of breaks it up a little bit. But in every category, they have, like, about a third. [00:44:31] Speaker B: They're always, like, the big dog. [00:44:32] Speaker A: So they're always up there. They're always up there. So seeing a layoff like this is just. It also. [00:44:37] Speaker B: It's just rough. I hate to see people get laid off. [00:44:39] Speaker A: It also makes me wonder: if you can lay off, in the span of less than a year, 10,000 people and have it not impact your business operations. Not. Not directly to those people. But why did you create those positions in the first place? [00:44:53] Speaker B: Riddle me this, Batman. [00:44:54] Speaker A: Right? [00:44:56] Speaker B: How. What. What poor business decision or business decisions did your executive leadership team make that you had a glut of 10,000 employees? [00:45:06] Speaker A: Yeah.
[00:45:07] Speaker B: How did that happen? [00:45:08] Speaker A: I know there was a. [00:45:09] Speaker B: You're either bad at making those decisions or you're bad at making these decisions. One of the two is true. In any case, maybe both. You're. You're not doing great. [00:45:17] Speaker A: Yeah. Yeah. You got some room for improvement. [00:45:18] Speaker B: I don't know if. Homeboy. What did it say? That Jared, their CEO, got $32 million? [00:45:23] Speaker A: Yeah. [00:45:24] Speaker B: This year? Maybe you're not worth that $32 million price tag. [00:45:28] Speaker A: Yeah. [00:45:29] Speaker B: Right. And we need to be looking for somebody else that's not going to steer the ship into a 10,000-person excess. [00:45:35] Speaker A: I remember hearing about. Obviously, I was still in college when. When Covid happened in 2020 and all that stuff. But I remember hearing that during that time, when a lot of people went remote and, like, tech jobs were booming and all this stuff, that, oh, people were creating positions, they were creating jobs, new job titles and stuff. That it was like, okay, maybe it seemed necessary at the time, or, hey, we can do this, we can create these new positions. And then it was like, oh, now maybe there's not as much of a need. So maybe that's part of it. [00:45:58] Speaker B: If I'm remembering. I've heard of that as, like, a business strategy, where you grow really fast to kind of, like, make yourself look attractive, that you're a big company and you do all these things. But that's. I'm not sure. I forget why. Again, not a business person, right? Just a tech dude looking from the outside and seeing people losing their jobs and going, why is that? Why does your company not see that coming? Or, and maybe that's legit, maybe you couldn't see it coming. I don't know. It just sucks that that is the state of it. Oh, I know. We're in. We're in tech, and tech has these. These hills and valleys of.
It's booming. It's doing great, and everybody can go get jobs. And then we go, oh, no, down we go. And you start seeing layoffs, and everybody's. [00:46:48] Speaker A: Struggling. But in this case, to have them say it's our second-best year on record or whatever, and then. So, okay, I don't know that the reason is that you're in one of those valleys. It doesn't really seem like. It seems like, from the outside. From the outside. With. Maybe there's context I'm missing. But, yeah, to your point, I think. I think in a lot of cases that is true, that you see these hills and valleys and stuff and ebbs and flows. But in some ways. [00:47:07] Speaker B: Can't feel your hands, can you? [00:47:08] Speaker A: I can't feel my hands. They're very numb. [00:47:10] Speaker B: I'm over here, like, 30 degrees. [00:47:11] Speaker A: You'll notice I have two shirts on, because it is very cold in here. [00:47:15] Speaker B: It is very cold in here. I just saw you, like, pulling your. [00:47:17] Speaker A: Yeah, I'm like, this is. I'm about to go Ariana Grande with just, like, she's just got the sleeves. You can't even see your hands. That's gonna be me. [00:47:22] Speaker B: Get those ones with the thumb holes. [00:47:27] Speaker A: Yeah, I'm very cold. My hands are numb. Anyway. So. Yeah. Interesting discussion there. Yeah, on both fronts. So hopefully those folks are, of course. [00:47:35] Speaker B: Hope they land on their feet, able. [00:47:36] Speaker A: To find work somewhere else soon. But this next one is going to be. We have not seen this segment in a while, so I'm looking forward to it. This is a little segment we like to call Tinfoil Hat. [00:47:47] Speaker B: The moon landing was fake. [00:47:48] Speaker A: Paul McCartney's been dead since 1966. [00:47:51] Speaker B: Dogs can't see color. 5G causes syphilis. Do you understand that? [00:47:56] Speaker A: I love that. I just realized I just processed syphilis. I loved that. Two of those things were, like, conspiracy theories.
Like, the moon landing was fake, 5G causes syphilis. But the other two things were just like, dogs can't see color. [00:48:07] Speaker B: It's like a true statement. Well, they don't. I think they have a limited color spectrum. [00:48:11] Speaker A: Yeah. And then what was the other one? It's the moon landing was faked, and then. Yeah. [00:48:15] Speaker B: Making the frog scary. [00:48:16] Speaker A: Make. [00:48:17] Speaker B: No, you know what it said? Wasn't the frog scary? [00:48:20] Speaker A: I don't think so. McCartney's been dead since 1966. [00:48:23] Speaker B: Yeah. [00:48:23] Speaker A: He said, do you understand that? I don't think that. I don't think the frog thing was part of the audio, but thank you for that. Oh, wow. We're starting off on a great note for this article. Well, so the reason that we put this under Tinfoil Hat is because, you know, it might be, but it might not be, a conspiracy. Ellison declares Oracle all-in on AI mass surveillance, says it'll keep everyone in line. Oh, that's reassuring. That's what I want to hear. [00:48:47] Speaker B: Yeah. This type of verbiage is not something you want coming out of your mouth. [00:48:51] Speaker A: No. [00:48:52] Speaker B: Just as a cursory look at history, I'm seeing that other people have said that we're not cool. [00:48:57] Speaker A: Yeah, well. And it's not even like there was any euphemistic, like, well, you know, it'll be helpful to have surveillance where necessary. He was just fully, like, straight up. [00:49:05] Speaker B: We should surveil everything. [00:49:06] Speaker A: He was like, cameras will be everywhere, citizens will be on their best behavior, because we're constantly recording and reporting. There are so many opportunities to exploit AI. And I'm just sitting here like, you're saying the quiet part out loud. [00:49:18] Speaker B: Yeah. I'm like, where did I put my copy of 1984, with the big telescreen in everybody's apartment or pod or whatever the heck it was?
Everybody living in the same thing. Right? You got the telescreen. Oh, comrade. Comrade, you're not doing your exercises correctly. Right. [00:49:35] Speaker A: Oracle's gonna start scanning my brain waves for, like, for, like, thoughtcrime or whatever. [00:49:39] Speaker B: Exactly, right. The Ministry of. [00:49:42] Speaker A: Ministry of Truth. Yeah, Ministry of Truth. Yeah, they're gonna start. So this guy, Larry Ellison, who's the co-founder of Oracle, he started with the argument that, well, for situations like body cams, right, saying, like, the police will be on their best behavior, doing what they're supposed to do, cause we're constantly watching and recording everything that's going on. No ability to disable the feed to Oracle, so you can't turn them off. [00:50:04] Speaker B: Right. [00:50:05] Speaker A: And then one of the things he said was, even for, like, bathroom breaks or meals, you can't turn off the feed or disable it. It's just that in order to access it, somebody would have to get a subpoena to be able to view it. So he starts with that, in this argument of, like, well, it's to protect people. It's to protect people. But then he moves on to: we'll just record everybody and report on everything all the time. [00:50:25] Speaker B: Isn't that how all fascist regimes start? Telling you, like, oh, it's for your good. [00:50:30] Speaker A: Right, right. [00:50:31] Speaker B: It's to protect you. [00:50:32] Speaker A: Yeah. [00:50:33] Speaker B: And we'll never use. I know. Yes, we're collecting it, but you have to have a subpoena before you get your hands on it. But we are collecting it so that we can subpoena it when it's necessary. This is interesting, because that was never abused in the past. [00:50:46] Speaker A: Right. Yeah. History doesn't repeat itself. [00:50:48] Speaker B: Yeah, it just rhymes. [00:50:49] Speaker A: No, I guess he's the CTO of Oracle.
He also made the comment that maybe drones could be used to pursue police suspects instead of relying on patrol vehicle chases. I'm curious, though, if you've just got a drone following somebody, does that just tell. Okay, it looks like they're going this way. Intercept them. [00:51:08] Speaker B: I'm not necessarily against the idea of. [00:51:10] Speaker A: Using a drones for, like, apprehension of. [00:51:14] Speaker B: Right. Because there's the idea that when cop cars get involved and they're chasing somebody down the highway at 115 miles an hour, that a wreck is most likely going to occur, and then we have damage and loss of life, possibly or injury. So you just go, cool, drone it. And the drone goes and just follows it wherever it goes. And they'll. They don't know what's up there, and they just think, oh, I lost the cops. [00:51:37] Speaker A: Yeah. [00:51:37] Speaker B: Right. [00:51:38] Speaker A: Then you just wait and intercept them. [00:51:40] Speaker B: And then you go, hey, they look like they're headed in this direction. You use AI to try to predict where they're gonna go. And I don't have any real issue with that kind of thing. And these are gonna obviously be specialized drones on specialized frequencies. That but. But there's always a but. It's a big but, and I cannot lie. Right. [00:52:04] Speaker A: I was wondering. [00:52:05] Speaker B: Yeah. If I was gonna go down that road. I did. I did it. I'm here. That still is a little scary. [00:52:12] Speaker A: Yeah. [00:52:13] Speaker B: Right. That. The fact that. [00:52:15] Speaker A: Where do you draw the line? [00:52:16] Speaker B: Right. Because it seems like we're militarizing our police force. There's a reason we have posse comitatus and all these laws that say you have to declare martial law before you start doing all this stuff. And we do say this. I don't want to live in a police state. That is what I'm worried about. 
So you might say, well, Daniel, it's a slippery slope argument to say that it will become a police state. And you're right. But slippery slope arguments aren't necessarily. [00:52:50] Speaker A: It would be foolish to not acknowledge the possibility. Not saying that it's guaranteed that's going. [00:52:54] Speaker B: To happen, but it's a non-zero. [00:52:56] Speaker A: Right. You can't say it's not possible or even plausible that that might happen, that. [00:53:00] Speaker B: Governments would overreach and use that. [00:53:04] Speaker A: Historically. [00:53:05] Speaker B: Historically, it has happened. [00:53:06] Speaker A: Yeah. [00:53:07] Speaker B: If history is any kind of indication of what the future is going to be like, we do this. [00:53:13] Speaker A: Yeah. [00:53:14] Speaker B: People obtain power so that they can have access to those things, so they can use that to maintain their power. [00:53:21] Speaker A: It's just funny that this guy went in that direction of, like, we'll just monitor everybody all the time and everybody will be on their best behavior. He could have very easily just stopped with the, like, talking about the drones. And then another thing he mentioned was using satellite imagery of farms analyzed by AI to forecast crop yield and suggest ways to improve field conditions. [00:53:36] Speaker B: That's totally cool, but I don't need the government doing that. [00:53:39] Speaker A: Right. [00:53:39] Speaker B: If I'm a farmer and I want to analyze my crops, that's my purview, that's my distinction. And you don't need that information stored in an AI, you know, Oracle database. [00:53:50] Speaker A: Right. If a farmer, you know, wants to do that or utilize that service and pay for it, that's different.
So I think he could have made these suggestions and brought up these ideas and just left it there and been like, hey, AI can be this cool thing for the future. But it's when you cross the territory into, we'll just have cameras on everybody all the time and you can't turn them off, that it's like, okay, now you've lost me. [00:54:09] Speaker B: Here's what we as citizens, honestly, really need to do: the barrier to creating your own AI on metal at your house should be low, right? Because, yes, might I get more power out of utilizing a large server farm? Of course. But trade-offs, right? Decentralization means that they also don't have access to that data, at least not easily. I'm a hacker, I totally get that. Things can be hacked, sure. But to decentralize all that away from them and bring it into your own space, you get a lot more control, you get a lot less eyes on it. And that would be an interesting project, to try to create a step-by-step. I don't know how difficult. I've never tried to create an AI system at home. I know it's possible; it's just, like, what kind of money is involved in that? And again, I would like to see what we can do to reduce that barrier to entry, so everybody can run their own LLMs and everything at the convenience of their own home, and they don't have to worry about training other people's AI as you train your own AI. Which I know, again, is possible. I just don't know what the barrier to entry is. [00:55:21] Speaker A: Plus, if it's easier then for folks to do that on their own, maybe then you become more informed and have a better understanding of it, and so there's less fear of it. Right. And you're more able to, you know, deal with it. Right. I think if this guy, if he were just some random dude from, like, California that was trying to do a tech startup and he was like, look at the potential of what we could do with AI.
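On the run-your-own-LLM-at-home point, the barrier to entry is lower than the conversation might suggest. Here's a minimal sketch assuming a local Ollama server on its default port with a model already pulled; the model name, prompt, and endpoint are illustrative assumptions, not anything from the episode:

```python
import json
import urllib.request

# Default Ollama endpoint; everything here stays on your own machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_prompt_payload(model, prompt):
    """Build the JSON body for a single, non-streaming generation request."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local_llm(model, prompt, url=OLLAMA_URL):
    """Send the prompt to the local server and return the generated text."""
    body = json.dumps(build_prompt_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # Assumes `ollama pull llama3` has been run; swap in any model you have.
    print(ask_local_llm("llama3", "Why run an LLM locally?"))
```

Nothing in this sketch leaves the machine except a localhost request, which is the decentralization point being made: your prompts aren't training anyone else's model.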
I don't know that I'd be as concerned. It would be like, okay, dude, like, let's calm down. But this guy is co-founder of Oracle, right? Oracle's been doing a lot more business with AWS, with Microsoft. I think Elon Musk is now using it in conjunction with Grok. So it's like, this is a company that works with big names like that, and to have this kind of mentality or ideas, it's like, okay, so how does the CEO of Microsoft feel about that? Is he in agreement with you on that? And in which case, at what point does Microsoft start employing that kind of technology? Oracle's done deals. They're a longtime US contractor. [00:56:07] Speaker B: So it's like, you don't say. Right, right. Oh, is that what you are? [00:56:12] Speaker A: It concerns me a little that, the position that you're in, and this is what your thought process is on this. [00:56:17] Speaker B: Right. [00:56:18] Speaker A: It just, I don't love that. And you don't see a problem with it? Just interesting. [00:56:22] Speaker B: I don't know how that is. Like, what, have you not seen Terminator or something? Or any, like, Equilibrium or The Matrix or any of these weird dystopian futures? [00:56:32] Speaker A: Have you learned nothing? [00:56:34] Speaker B: Yeah, like, come on, man. [00:56:36] Speaker A: You would think. You would think. That book that I'm reading right now, that's 1984. [00:56:40] Speaker B: Make Orwell fiction again. [00:56:41] Speaker A: That's next on the list. Yes. The book about Clearview that talks about, like, the origins of it, how it started. The original inspiration behind it was a service called Smart Checker, and it was supposed to be used to check people's political backgrounds and stuff. So that's kind of where it got its origin. But at one point. [00:56:59] Speaker B: Well, isn't China doing that right now?
Well, yes, they have social credit scores, and if your social credit score is too low, you can't buy plane tickets, you can't ride the train or whatever. Right. [00:57:09] Speaker A: It's like, the Smart Checker thing was a US-based thing. It was, like, ten or so years ago that it first kind of was a seed that was planted. So I forget who it was that did the interview with somebody from Google. But basically, somebody from Google spoke off the record and they were talking about this technology. It was when Google image analysis and image search were first coming into play, where you can, hey, take a picture of a bird and it'll say, oh, this is this bird. It was when it was first coming onto the field and it wasn't super well trained yet. Worked for some stuff, but not for others. Right. And so they were like, well, you realize, you know, these guys with Smart Checker, they've got this technology and you could employ this. And the folks at Google were basically like, so we know we've got that here. We just haven't started using it yet, 'cause we don't know if it's a good idea. We're afraid to. I feel like if you've got people at Google saying that, okay, that tells you it probably is something that we need to be very careful with. Because I feel like folks at companies like Google and Microsoft are usually pretty open to, like, let's do this. It's progress. So if even they hesitate. [00:58:04] Speaker B: Yeah. In my mind, I feel like this just should be an innate human reaction to the gathering and collating of every single piece of information about you. [00:58:16] Speaker A: Yeah. [00:58:17] Speaker B: For. Well, we'll just say even for advertising purposes. Even for advertising purposes, I don't need you to be that well acquainted with me to advertise to me. Right. There's no good purpose for it.
If I need soap or trash bags or whatever it is that I need to buy to go throughout my day, I'm just going to go to the store and go, I get it. You want me to buy your trash bags. But I feel like we have crossed too far over the line as far as what you're gathering about me so that you can make sure I buy your trash bags. [00:58:53] Speaker A: Yeah. [00:58:54] Speaker B: Right. There needs to be a buffer zone between me and the people selling me stuff, about what they know about me. I would agree. That is my strong opinion. [00:59:04] Speaker A: Snapchat's got a service now where they're incorporating your selfies into personalized ads for you, and you have to go in and opt out of it. And you. [00:59:11] Speaker B: Which is why I don't have Snapchat. [00:59:12] Speaker A: Right. [00:59:12] Speaker B: Or Facebook or Insta or. I mean, technically I do, but I don't want it. [00:59:18] Speaker A: I feel like Instagram is more widely used. You're in your forties, right? If you, in your forties, had Snapchat, I'd be a little bit like, why? So it's probably good you don't have Snapchat anyway. 'Cause, like, people, even I don't really use it much anymore. [00:59:30] Speaker B: I don't use WhatsApp. [00:59:31] Speaker A: People out of high school generally don't. [00:59:33] Speaker B: I don't use TikTok. [00:59:34] Speaker A: Yeah. [00:59:35] Speaker B: I don't. I'm not on X. [00:59:36] Speaker A: But I think you're right. With the customized ads and stuff, it's like, there's a point where it's like, I don't need personalized ads, even, that just know, like, what kind of soap I like. I definitely don't need you putting my face in ads. [00:59:47] Speaker B: Oh, and I stay logged into Google with, like, a fake account. [00:59:50] Speaker A: Yeah. [00:59:51] Speaker B: Like, it's. [00:59:52] Speaker A: You don't need that information, right? [00:59:53] Speaker B: I'm VPNing and I'm like, screw you.
[00:59:56] Speaker A: No information for free. [00:59:57] Speaker B: I got burned for, you get nothing. [01:00:01] Speaker A: Well, on the subject of, you know, social media apps and stuff using information, using public posts and things to find information about you: Meta is going to be training their AI models using public UK Facebook and Instagram posts. And we've touched on this before, we've talked about this kind of thread before. But in this case, the thing that made this interesting to me is that the UK, or, you know, I guess it's the European Union, and then the UK has their own set now. But GDPR, right, the UK's version of GDPR, that's the whole thing about, like, you have the right to deny them use of your data, you have the right to disappear or whatever it is. Right. [01:00:34] Speaker B: And they generally. Right to be forgotten. [01:00:36] Speaker A: It's called the right to be forgotten. And they've enforced that, it seems like, especially recently, pretty well. There have been a lot of, like, lawsuits and stuff that have cropped up where, like, hey, you can't do this. You're breaking rules. [01:00:45] Speaker B: Yeah, but didn't, like, Google just win that? Like, it was overturned or whatever. They. [01:00:49] Speaker A: Think that it's gonna be appealed. But yes, Google, for now, has won that lawsuit and got out of, I think it was, like, a 1.5 billion euro fine they were gonna face, and they got out of it. But I think it's gonna be appealed. [01:01:00] Speaker B: So we'll just. There are some interesting things around Ireland when it comes to GDPR. Like, you could say it's basically a haven of some kind. It's weird. I'll have to look more into that. But I just heard about this, that, yeah, you'll see a lot of these companies setting up shop in Ireland specifically so that they can, like, circumvent things like GDPR. [01:01:20] Speaker A: Yeah. [01:01:20] Speaker B: Yeah. Weird. [01:01:21] Speaker A: It is. Yeah.
It's crazy times we're living in, isn't it? It just gets weirder every day. Well, as part of this process, users aged 18 and above, and the reason for that is, I think, if you're under the age of 18, they can't use your data at all. But adult users, that's who this is going to affect. Users aged 18 and above are expected to receive in-app notifications starting this week on Facebook and Instagram, basically explaining what's going on and how you can access an objection form to deny your data being used. So you have to go do that. And maybe, maybe I'm stupid, I don't know. But that, to me, is the part that's kind of like, I don't love that. I don't love when you have to go through extra hurdles to opt out of your data being used. I feel like the default should be, we're not going to use your stuff. And then, if you're okay with it, hey, check this box, and then we'll use it. [01:02:02] Speaker B: And I'm even cool with, like, you know, once a week or whatever, you get a prompt saying, hey, would you like to opt in? [01:02:09] Speaker A: Right? [01:02:10] Speaker B: Or, would you like to see less of these prompts? [01:02:12] Speaker A: Right. [01:02:12] Speaker B: And then it goes to, like, once a month, once every quarter or whatever. [01:02:17] Speaker A: Yeah. [01:02:18] Speaker B: And that's totally fine. And again, make it super easy for the user to go, this is what we want to do. Are you cool with that? [01:02:25] Speaker A: Right? [01:02:26] Speaker B: If not, you're already out. We'd like you in, but if you don't want to, you just hit, no, thank you, and you move on about your day. [01:02:32] Speaker A: Right? You can't do what Roku did, where in order to opt out of the terms of service or deny them, you have to write a letter. [01:02:36] Speaker B: I gotta fax them a letter. [01:02:39] Speaker A: It's just not the way to go. [01:02:40] Speaker B: And then by carrier pigeon, it shows up, you know, on the.
[01:02:44] Speaker A: Night of a full moon, you gotta meet all these requirements. So, yeah, I think you're right in that it's got to be user-friendly. [01:02:52] Speaker B: It's got to be the 30th day of February, right? [01:02:55] Speaker A: The fifth of never. Right. [01:02:56] Speaker B: Yeah. [01:02:56] Speaker A: And it has to be obvious what you're agreeing to or opting out of. Like, that would be my opinion, I guess. Not to be flowery about it, but make it so that people can understand. [01:03:06] Speaker B: What does that tell you about those companies? [01:03:09] Speaker A: Well, I mean, they wanted to make it as difficult as possible for you to opt out of it because they want your data. [01:03:12] Speaker B: You said that, not me. [01:03:15] Speaker A: Somebody's gonna break it. FBI, open up. So they did say that: we'll honor users' choices, we will not contact users who've already objected to their data being used, we won't include private messages with friends and family, which I feel like should be a given, but okay. Or information from accounts of minors. And of course, that doesn't really account for everything. You can sign up for Instagram and say that you're 18 when you're really 15. People do it all the time. [01:03:37] Speaker B: I mean. Yeah, I mean, I wish there was some sort of. [01:03:40] Speaker A: There's not really anything Meta can do about that. [01:03:42] Speaker B: Right. [01:03:42] Speaker A: It's like, the most they can do is say, hey, we're trusting you. Unless they start asking for ID. [01:03:46] Speaker B: There are more things they could do, they just don't, because the law does not compel them to. [01:03:50] Speaker A: Right. Well, and I don't know what you would do. Would you ask for ID? 'Cause I don't want to share my ID. [01:03:54] Speaker B: You know what, I don't know off the top of my head. Obviously, you would have to set up some sort of functionality. If you can build AI.
I feel like you could figure out a procedure for verifying somebody's age. Just saying. [01:04:08] Speaker A: You probably could. Yeah, you probably could. So. But at least for now, they're not going to use data from minors, which is good. Again, I feel like that should be a given. [01:04:17] Speaker B: How much do you trust that to be true? [01:04:19] Speaker A: Right, right. Yeah, it'll come out in, you know. [01:04:22] Speaker B: Oh, well, actually, you opted in. Because you didn't opt out, you gave me permission. [01:04:27] Speaker A: Well, and how do you handle. [01:04:28] Speaker B: Bypass our policy that says we don't use minor data. [01:04:33] Speaker A: And there's a lot of accounts, especially, like, child celebrities. You'll have, like, a, you know, oh, you have to be at least 15 or 13 or whatever to have an Instagram account. So you'll have, like, a ten-year-old that's, like, a famous actor, and the account's managed by mom, or account managed by parents. [01:04:45] Speaker B: So then, let's say I've got an Instagram account. I'm a huge fan of some child celebrity, and I'm, you know, following them around. Let's say they're a singer or something. I'm following them around on tour. I'm taking pictures. I've got a whole Insta and it's nothing but them. [01:04:57] Speaker A: Yeah. [01:04:58] Speaker B: Is that technically minor data? [01:05:00] Speaker A: Minor data? [01:05:00] Speaker B: Or is it mine? Because I posted it and I'm a 44-year-old weirdo, right? Yeah. [01:05:06] Speaker A: Or even just if it's a parent. Like, if the account is entirely about some kid celebrity. I can't even think of a celebrity right now. But obviously, they exist, child ones. [01:05:14] Speaker B: The ones that pop into my mind, since I do have kids, are, like, the Ninja Kidz and Fun Squad. These are all kids. Their parents run the accounts. They're ma. Are you kidding? They're massive. [01:05:26] Speaker A: I don't have kids.
[01:05:27] Speaker B: So that, you know, shows how big they are. These kids run YouTube. [01:05:31] Speaker A: Okay? [01:05:32] Speaker B: I'm not even joking. Trinity and Madison, like, all these. All these things. These. Bruh, I know way too much about kids because I have little kids. Yeah, yeah. And it's like, right, those are minors, but their parents run those accounts, I'm sure. [01:05:49] Speaker A: So where's the line, then? These are issues that, even in the last ten years, 15 years, 20 years, we haven't really come up against. And now it's becoming more of a question. [01:05:59] Speaker B: Right. And YouTube has a policy that says, so if I post a video, there's an area when I'm uploading the video that says, I allow people to repost and use this content in reposts. You can check that off or leave it unchecked, or whatever the case. I think the default is to allow people to use it in reposts. You have to check it to say, I don't want them to be able to repost it without X, Y, or Z reasons. [01:06:22] Speaker A: Sure. [01:06:22] Speaker B: Right. So again, weirdos out there going, oh, yeah, I'm watching the Ninja Kidz and I'm cutting up their posts. I'm reposting their stuff. Does that fall under it? See how weird it gets? The better thing to do is just say, no, thank you. [01:06:37] Speaker A: Yeah, just say no. [01:06:39] Speaker B: If you want to train your AI, hire a bunch of photographers to go out in public areas and take pictures. [01:06:45] Speaker A: Yeah. [01:06:46] Speaker B: And do the thing and gather that information that is in a public area. [01:06:50] Speaker A: Yeah, I would agree. [01:06:51] Speaker B: Right? [01:06:52] Speaker A: I would have to agree. [01:06:53] Speaker B: You got money, use it. [01:06:55] Speaker A: There are definitely certain governments and individuals, I'm sure, that are not in love with this idea. So we'll have to see how this plays out, and if there is a whole lot of pushback against this.
[01:07:07] Speaker B: What does it tell you about companies like Meta? You take a picture and you post it on Instagram. You do not own that picture. Yeah, they own that picture. That is their picture. Right. As long as they don't do anything illegal. [01:07:20] Speaker A: Right. [01:07:20] Speaker B: With that content, it's not yours. [01:07:23] Speaker A: I don't really have a leg to stand on. [01:07:24] Speaker B: That's exactly right. [01:07:25] Speaker A: It's unfortunate, but it is the way it is. It's the world we live in now, FYI. Stay tuned for. [01:07:30] Speaker B: I wonder, if I took a picture with my phone, I have that digital media on my phone, which I own. Right. If I upload said picture to, like, Instagram or whatever, that is a. I believe it is transformed. There's going to be compression, and especially if I use filters and everything, now it's a derivative of the original. Do they just own the derivative and I own the original? So if I continued to use that and they tried to stop me, saying, oh, no, no, we own that, would they have a leg to stand on? That's an interesting thought. Anyway. Sorry. Derailing the conversation. [01:08:07] Speaker A: No, that's good. That's what we're here for. And we do have, I'm sure this will inspire a little bit of conversation, our last thing of the day, I promise. Uh, but the Apple Vision Pro. Actually, no, I might have pulled the wrong article here. I might have pulled the wrong article. There was an article we wanted to talk about. No, no. Maybe it is. [01:08:22] Speaker B: Apple Vision Pro vulnerability exposed keyboard inputs to attackers. [01:08:26] Speaker A: I did do a hard reset at the break, so I was wondering if it did go through. So, yes: Apple Vision Pro vulnerability exposed virtual keyboard inputs to attackers. This, to me, just was.
It kind of just raises the issue of these new, novel attacks. This wasn't exploited, that we know of; it was patched before anything could be exploited. So that's good. It basically would allow the attacker to infer eye-related biometrics to reconstruct text entered via gaze-controlled typing. If you said that sentence to a Victorian child, they would die immediately. Like, I. This. [01:08:59] Speaker B: I've always said there is, like, a time machine test on things. Go back to, like, 2013 and say, oh, you know, in the future, there's going to be a respiratory virus that shuts down the entire world, and they'll check up your butt for whether or not you have it. They would lock me up as a lunatic. They would be like, this guy has lost it. [01:09:22] Speaker A: This sounds like a plot from a sci-fi, like a dystopian film, right? Gaze-controlled typing and eye-related biometrics from the avatar images. GAZEploit. [01:09:30] Speaker B: Like. Right. [01:09:31] Speaker A: Like, it doesn't sound real. [01:09:33] Speaker B: I feel like that kind of goes along with where we started this episode, and now here we are, bookending it, full circle. Another device that is making you exposed, because we continually start to integrate technology into our lives to great extents. And I'm not an anti-technology person. Obviously, I work in technology. [01:09:53] Speaker A: Clearly. [01:09:53] Speaker B: Yeah, I like it a lot. I just don't like how that technology gets connected to everything. That's where I'm like, I should have the ability to cut off that connection, honestly. So there is a phone out there. I forget the name of it. I have it bookmarked at home or whatever. I'm definitely going to go to it. I saw it on a podcast. When you turn the wireless off, it physically disconnects the wireless device from the phone. [01:10:24] Speaker A: Yeah. [01:10:25] Speaker B: Right. If you. The cameras, when you.
You can literally, like, open the circuits so that the cameras cannot be used. Like, physical connections inside this phone. I'm like, I like this. [01:10:39] Speaker A: Yeah, that's pretty cool. [01:10:40] Speaker B: When I want it connected, I'll connect. [01:10:42] Speaker A: Gives you more control. [01:10:42] Speaker B: Right. It's not storing information about me with all the little bells and whistles inside of it that I'm not really privy to. So, fun fact: you turn your phone off, like, powered off. It's not off. It's still on. Right. And it is collecting information. You're not using it. Okay, cool. So during your downtime, it's doing all these things to gather information. Run your phone through a network sniffer. [01:11:12] Speaker A: Yeah. [01:11:12] Speaker B: And watch all the stuff that it does. It's insane. Right? So this phone, when you turn it off, the battery physically disconnects, or you have the ability to physically disconnect the battery. [01:11:23] Speaker A: So you know for sure. [01:11:24] Speaker B: Yes. There is no power to it, and it's not collecting information about you. That's not its job. Its job is to be a phone. Give you a browser, make phone calls, do texting, take pictures, connect to the Internet when you want to. And then, when you don't, you can turn all that stuff off, and it doesn't do that. [01:11:45] Speaker A: The argument a lot of times is, for instance, like, Google is a free service, but it's not free, because you're paying with your data. Right? Things like that. Gmail, whatever. In this case, though, it's like, you pay $1,000 or whatever it is for the newest iPhone, the iPhone 16 or whatever that just came out. And then you continue to pay with every little piece of your data and information.
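If you want to try the run-your-phone-through-a-network-sniffer experiment yourself, here's a minimal sketch using the third-party scapy library (`pip install scapy`; live capture needs root). The phone's LAN address below is a made-up placeholder, not anything from the episode:

```python
from collections import Counter

def tally_destinations(pairs):
    """Count packets per destination IP from a list of (src, dst) pairs."""
    counts = Counter()
    for _src, dst in pairs:
        counts[dst] += 1
    return counts

# Live capture: run with root privileges on the same LAN as the phone.
if __name__ == "__main__":
    from scapy.all import sniff, IP  # third-party: pip install scapy

    PHONE_IP = "192.168.1.50"  # assumption: replace with your phone's address
    # BPF filter keeps only the phone's traffic; capture for 60 seconds.
    pkts = sniff(filter=f"host {PHONE_IP}", timeout=60)
    pairs = [(p[IP].src, p[IP].dst) for p in pkts if IP in p]
    for dst, n in tally_destinations(pairs).most_common(10):
        print(f"{dst}: {n} packets")
```

Even with the screen off, the top-ten list is usually not empty, which is the point being made: the destinations you see while you aren't using the phone are the "bells and whistles" phoning home.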
[01:12:01] Speaker B: I just, I was telling you this morning at our stand-up meeting, I listened to a talk yesterday from DEF CON, and it was called "Disenshittify or Die." And it was a lot about how products that we have grown to know and love and become, like, reliant on were really cool when they started and now kind of suck, and how we hold each other hostage by continuing to use them, but we can't walk away, because that's where all our friends and stuff are. So we're kind of stuck in this, like, circular tornado of staying in. But one of the things the person said in their talk was, yes, it is true that if you are not paying for a product, you are the product. But now it is also true that if you are paying for the product, you are also the product. [01:12:46] Speaker A: Yeah. [01:12:46] Speaker B: Right. It doesn't matter. They're spending money. How many times do we see that companies are losing money giving you the product? Right. They take a loss, because it doesn't matter. They make more money on what the product does once it's in your hand than they do on the actual product itself. [01:13:04] Speaker A: No. [01:13:05] Speaker B: Right. So they don't care that they sell it at a loss. [01:13:08] Speaker A: And you think, oh, great, this is a great price and a great deal. [01:13:13] Speaker B: Yeah, you're getting a piece of crap. [01:13:14] Speaker A: It's surveilling you. Yeah, yeah. You're selling a different part of yourself. You're selling your information, I guess, without really even probably knowing it or cognitively, like, being aware of it. [01:13:23] Speaker B: Yeah. [01:13:23] Speaker A: So it's a scary world. It's great. [01:13:25] Speaker B: I'm further and further starting to want to, like, really anonymize and go as anonymous as possible. [01:13:33] Speaker A: Yeah. [01:13:34] Speaker B: And I wish that was what the Internet was, is that we were just all big, fat, anonymous people.
No, but you had, like, an enclave to where, if you wanted to be you, you could be you. [01:13:42] Speaker A: I think we're moving, if anything, in the other direction, where there are folks pushing for, like, AI credentials, everything online to prove that you're not a deepfake, to prove that you're a real person, you're not a made-up persona. Pushing that you have, like, basically a virtual ID. [01:13:56] Speaker B: That's. [01:13:56] Speaker A: That's for you. [01:13:57] Speaker B: Because we've utilized technology to create fake people. [01:14:00] Speaker A: Right. Yeah. [01:14:01] Speaker B: And very convincingly. [01:14:02] Speaker A: So I think we're going, it's like, unfortunately, in the opposite direction. [01:14:05] Speaker B: This seems paradoxical. [01:14:06] Speaker A: Yeah. Yeah. It's a scary world. Not to end on, like, a sour note, but it's like you said earlier: unless you're willing to become a Luddite, I guess, and disown technology, it's hard to. That's the next segment. At what point are we gonna start an episode and Daniel's not gonna be here? Daniel has denounced computers. So, you know, love him, praying for him, hope he's doing okay in the woods. [01:14:28] Speaker B: I totally understand that. I have a lot of, like, seemingly contradictory viewpoints. I'm just working it out. This is my therapy session. [01:14:37] Speaker A: We all are, Daniel. We all are. We're here for you. But I think that was our last news piece of the day. Of course, there's a bunch of other stuff that comes out during the week. So if anything comes up in the next few days that you want us to cover on next week's episode, please leave a comment. Let us know. Subscribe so you never miss an episode in the future. Leave a like if you enjoyed this episode. We do read the comments. We do like seeing people's input, answers to stuff. It is super fun to see what y'all have to say. So I would highly recommend you do that.
And don't forget, as you can see by my Deadwood shirt, we will be in Deadwood, South Dakota, in just a few short weeks. We will be there for. [01:15:07] Speaker B: Spider-Man will be there. [01:15:08] Speaker A: Yeah. Peter Parker over there is going to be joining us. We will be there the 8th through the 11th or 12th, I think. Yeah, we'll be doing Technado from Deadwood, so that'll be super fun. And if you're going to be there, we would love it if you came and found us and said hello. But, yeah, that's pretty much it. That's all I got. [01:15:23] Speaker B: Yeah. [01:15:23] Speaker A: Anything on your end? [01:15:24] Speaker B: Take a selfie while you're there and sell it to Meta. [01:15:26] Speaker A: Yeah. Or do it in Snapchat, and Snapchat will do it for you. Thank you for joining us for this tinfoil hat Technado, and we hope you enjoyed. We'll see you next week. Thanks for watching. If you enjoyed today's show, consider subscribing so you'll never miss a new episode.
