Episode Transcript
[00:00:04] Speaker A: You're listening to Technado. Welcome back to another episode of Technado, Halloween edition. Kind of. I'm your host, Cozy Bear. Kind of. And of course I'm here alongside Daniel Lowrie.
[00:00:17] Speaker B: And is she not just the cutest, most adorbs APT you ever did see?
[00:00:21] Speaker A: Oh, gosh. And this is. You know, I talked to my mom about this. She brought up the point: hey, you know, you're going to go in there and it's, like, kind of a cute costume.
[00:00:28] Speaker B: Yeah.
[00:00:28] Speaker A: Why are we giving them names like this? Why do we name them either adorable or horribly threatening, cool-sounding names? I will start a petition: Diaper Boy or bust. If you named a threat Diaper Boy and I came in here dressed as that, it'd be terrifying. Like, but for the wrong reasons.
[00:00:43] Speaker B: Yeah.
[00:00:43] Speaker A: You wouldn't be terrified. Like, wow, he's cool and I revere him. It would be like, oh, he's kind of messed up. Like we should probably do something about that, you know. So anyway. Yeah. Your point being? Oh, it's a cute costume. Threats should not be cute. No, they should not be cute.
[00:00:55] Speaker B: They shouldn't be like any kind of appealing.
[00:00:59] Speaker A: Right. Take that as a lesson. Take that as a lesson.
[00:01:01] Speaker B: Stop doing it, kids.
[00:01:02] Speaker A: Stop doing it. So anyway, yeah, of course that's. I'm not really Cozy Bear. I should have pulled my action figure down for this segment. It's somewhere back there that's.
[00:01:09] Speaker B: That's a little too badass too.
[00:01:11] Speaker A: That. Yeah. That one's actually kind of cool looking. He does have the first action figure. I'll pull him down later and we'll. We'll show him after the break or something. But, but yeah, I've wanted to. Wanted to dress up today because this episode will air on Halloween. Daniel doesn't need a costume because he's scary enough as it is. So that's why, you know, you're dressed in black though.
[00:01:26] Speaker B: I'm a serial killer. Right. I look just like everyone else.
[00:01:29] Speaker A: Very true, very true. I guess you could be. You could pretend that you're a black hat hacker for the day. That could be your costume because you're wearing your hacker.
[00:01:35] Speaker B: I am wearing my lovely.
[00:01:38] Speaker A: Yeah, yeah.
[00:01:38] Speaker B: By the way, it gets like it's like 85 degrees outside, but it's about 58 degrees in here.
[00:01:44] Speaker A: It's freezing. It is, yes.
[00:01:46] Speaker B: Super cold in the studio.
[00:01:47] Speaker A: So this costume's twofold because it keeps me warm unless. Except for my arms. Arms are just out of luck.
[00:01:52] Speaker B: Y.
[00:01:52] Speaker A: But in keeping with the theme of Halloween being this week, we've got some articles this week that are just downright scary. So it's going to be fun. We do have a couple segments that we're going to get into, so I'm looking forward to it.
[00:02:02] Speaker B: Scary as a too far put out trailer hitch.
[00:02:04] Speaker A: Yeah, you just shin kick to get it away from you.
[00:02:07] Speaker B: Yeah, we watched the video right before this. That had me in stitches.
Look, dog, I ain't trying to be out here handling things two or three times. Whoa.
Guy shin-kicking a trailer hitch.
[00:02:31] Speaker A: He's going to start thinking about it halfway through this episode and he'll just be sobbing, straight up.
[00:02:35] Speaker B: Broke my computer, man. I was, I was having trouble.
[00:02:38] Speaker A: He was struggling, so I was on the struggle bus. Hopefully we can get through at least the first half without him breaking, but we'll see. You know, that could be the scariest part of all. We never know when he's going to snap.
[00:02:46] Speaker B: It's going to happen.
[00:02:47] Speaker A: But one of our favorite segments here and the one we usually like to start with, breaking news.
Breaking news never gets old.
[00:02:55] Speaker B: Like when you break your shit on a trailer hitch.
[00:02:58] Speaker A: Oh, we're off to a roaring start. It's going to be great.
[00:03:01] Speaker B: Oh, man.
[00:03:02] Speaker A: Well, this isn't so much scary as it is just a little ridiculous. But Russia is attempting to fine Google 2.5 undecillion. Or actually decillion. Sorry, yeah. Dollars. Or more money than actually exists on Earth. All because it's upset about some YouTube channels.
[00:03:20] Speaker B: It's so. It's like Russia has finally gone full comic book bad guy.
[00:03:26] Speaker A: Yes, right.
[00:03:26] Speaker B: Unless you pay me 25 decillion dollars. Or whatever. I guess it's 2.5 decillion dollars.
[00:03:34] Speaker A: Yes.
[00:03:35] Speaker B: Then, you know, we will release the Kraken. What are you going to do again? Nothing. Okay.
[00:03:57] Speaker A: Yeah, it is a. It's a one followed by 33 zeros. Or a 2.5, I guess, followed by 33 zeros. So that is a lot. And in rubles, that is, I believe, two undecillion. So in Russian currency, that's even more.
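(For scale, a quick sketch in Python using the short-scale number names quoted here. The decillion/undecillion definitions are standard; the fine amounts are just the figures as stated in the episode.)

    decillion = 10 ** 33        # short scale: a 1 followed by 33 zeros
    undecillion = 10 ** 36      # a 1 followed by 36 zeros

    fine_usd = 25 * 10 ** 32    # 2.5 decillion dollars, kept exact as an integer
    fine_rub = 2 * undecillion  # roughly two undecillion rubles

    print(f"${fine_usd:,}")            # the fine, written out in full
    print(fine_usd // 10 ** 15)        # how many quadrillions that is
    print(len(str(fine_rub)) - 1)      # 36 -- zeros after the leading 2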
[00:03:57] Speaker B: What happened?
[00:03:58] Speaker A: What happened?
[00:03:58] Speaker B: You know. So from what I understand, this is the story. Vladimir Putin has a YouTube channel, he's a streamer, you know, he gets into gaming, he's got a lot of followers. And then he got a hard strike. And when he appealed it, they said, well, you did something wrong. He said, what did I do wrong? And they said, we can't tell you.
That's funny. So here's what happens. YouTube does that shit a lot.
[00:04:23] Speaker A: I was gonna say it's just like. That's true.
[00:04:24] Speaker B: They love to do that.
[00:04:25] Speaker A: It's not even like it's a joke, but not really because that is, that's the reality.
[00:04:29] Speaker B: So I can understand their rage. Yeah, none of that is true, by the way.
[00:04:35] Speaker A: So the main kind of thing that's talked about in this article is the fact that, you know, there's not even that much money on Earth. Even if you took everything you could possibly conceive of as money, like the above-ground gold supply, crypto, all that stuff, it would be like several quadrillion dollars. So not even close to the amount that they're fining. So it's almost just like you're just trying to make a point.
[00:04:52] Speaker B: Yeah.
[00:04:52] Speaker A: Like you're not.
[00:04:53] Speaker B: Pay them in doge.
[00:04:56] Speaker A: Yeah, Here you go. It's. It'll be fine.
[00:04:58] Speaker B: Done. Glad we could do business. Buy low, sell high, you know, man, we messed up. My bad.
[00:04:58] Speaker A: And apparently the reasoning for this is because YouTube blocked Russian YouTube channels, specifically Tsargrad, I think that's how that's pronounced, and RIA FAN, F-A-N. And I'm not sure if that's how they would prefer me to say that. So a couple of channels specifically that they claim were targeted, over which they won lawsuits.
[00:05:21] Speaker B: I'm surprised that Google or YouTube is still publishing Russian-based channels.
[00:05:26] Speaker A: We're seeing a trend more and more of, like, different companies and people that are totally shutting out, you know, Russian users or anybody that's based in Russia. We saw what happened with Kaspersky, which apparently is pronounced "Kasper-sky." And I've been pronouncing it wrong.
[00:05:38] Speaker B: Really?
[00:05:38] Speaker A: Yeah, I got a bunch of comments on, like, a TikTok I did about it that were like, you're pronouncing it wrong. Wow. And I'm like, okay, that's a neckbeard moment.
[00:05:45] Speaker B: So, fun fact. When someone pronounces something wrongly, it's usually because they've never heard it said out loud. It's always been in a written form. And that's where you get tomayto, tomahto.
[00:05:58] Speaker A: Really?
[00:05:59] Speaker B: That's what happens. That's where you get those differences that are generally accepted and it becomes. Oh, that is an accepted way of saying it. I would argue at this point Kaspersky is an accepted form.
[00:06:11] Speaker A: Yeah.
[00:06:11] Speaker B: Of Kaspersky.
[00:06:13] Speaker A: And that looks more like how a Russian pronunciation of that word would look. And it's a Russian company. So anyway. But apparently I was pronouncing it wrong and everybody was like, look at this girl pronouncing it "Kaspersky."
[00:06:23] Speaker B: Wow.
[00:06:23] Speaker A: I'm like, yeah, sorry, I read. Yeah, God forbid.
[00:06:26] Speaker B: So anyway, how dare you not know Russian?
[00:06:28] Speaker A: Yeah, how dare I not speak.
[00:06:29] Speaker B: What were you thinking?
[00:06:30] Speaker A: I read. I read a ton. I just don't listen enough. I guess that's my problem. I don't listen enough.
[00:06:37] Speaker B: Yeah. So Russian podcasts, right?
[00:06:40] Speaker A: I need to start tuning into Russian newscasts or something, if they'll let me. I don't know. But in this case, the way that it got to this just astronomical amount is that the punishment for blocking those YouTube channels was daily penalties of a hundred thousand rubles, which then doubled each week. And so, of course, you know, like, a penny doubled every day for a month is some crazy amount.
[00:06:59] Speaker B: Exponential, right?
[00:07:01] Speaker A: So it just doubled every week. And so it's like, after a while, you know, that's going to get to be pretty crazy. But I mean, to get to the decillions is insane to me. How long has this been going on? Like, that's just a crazy number. At some point as Russia, I'd be embarrassed. I'd be like, okay, maybe we overreacted a little bit. Maybe, yeah, just forget I said anything. You go back, you know, you don't come out for a while until everybody forgets about this story. So anyway, I saw that number.
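(A back-of-the-envelope sketch in Python of how weekly doubling runs away, assuming the mechanics as described here: a 100,000-ruble daily penalty whose rate doubles every week, accrued until the total passes two undecillion rubles. The figures are the episode's, not from any court filing.)

    TARGET = 2 * 10 ** 36    # ~2 undecillion rubles
    daily = 100_000          # starting daily penalty
    total = 0
    weeks = 0
    while total < TARGET:
        total += daily * 7   # accrue one week at the current daily rate
        daily *= 2           # the daily rate doubles each week
        weeks += 1
    print(weeks)             # 102 -- weekly doubling gets there in about two years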
[00:07:25] Speaker B: It's absurd. Yeah, right. Like, why even do this? It makes you look weird. Right? It's just a strange thing to do. Like, I'm gonna, I'm going to fine you. More money than exists.
[00:07:38] Speaker A: Yeah, right.
[00:07:40] Speaker B: Even if they said we're going to. Let's say that they fine them $3 trillion.
That's still absurd. That's not going to be how this goes. So you can fine people all you want. You might as well fine them a goose that lays golden eggs, because it doesn't exist. It's never going to happen.
I, I, we live in crazy times. I guess that's all.
[00:08:01] Speaker A: We live in crazy times.
[00:08:02] Speaker B: That's all it boils down to.
[00:08:03] Speaker A: And per the article, though. Yeah. Google's not really even doing much business in Russia now anyway, so they're not going to try to pay these fines. Why would you even try?
[00:08:12] Speaker B: At that point, Vlad's on the phone again. He wants his money.
[00:08:15] Speaker A: This is like the one time. Take a message. Like, I'll be on the side of Google and be like, guys. Yeah, look at Russia over there. Get a load of this guy. Like, why? Why? What's the point? We got one more breaking news article. We don't have to dwell on it too long. I just wanted to talk about it real quick. Call of Duty: Black Ops 6, of course, came out last week. If any of y'all have played that, I would love to hear what you think of it, because I'm hesitant now to buy any more games that are just, like, the sixth iteration of the same game, but apparently it's pretty good. Not good enough to keep staff working on it, though: some of the Black Ops 6 quality assurance workers have gone on strike, and the reasoning for this, as stated, is that they are not a big fan of the return-to-office policies that are being imposed currently, I guess, in the Activision part of. Because, you know, Microsoft bought Activision, or acquired them, merged with them, whatever you want to call it. So they are now going on strike. They are not happy. It's not all the employees, but a big chunk of them. They're saying the policy is unfair, that it's harmful for people with medical conditions that require remote work. So I would be curious to know.
[00:09:13] Speaker B: Like, what is the.
What is the percentage of people with medical conditions that would, like, preclude them from coming into work?
[00:09:20] Speaker A: Well, right. And if that's the case, was there any not. Not to fault folks that are striking, But I wonder if there was any conversation with whoever's, you know, in charge of these policies about, hey, could I get an exemption? Because I'm, you know, not able to come in, I'm ill, I'm disabled, whatever the case may be. Because like you said, I can't imagine that it's the majority of people.
[00:09:41] Speaker B: Well, I would assume that's a very small percentage.
[00:09:43] Speaker A: And I would think that if, I mean, unless you went and talked to them and they were like, nope, sorry, you're just gonna have to deal with it.
[00:09:48] Speaker B: And again, you're right. Did the people that came up with legitimate medical conditions for not being able to return to work. Were they then rebuffed and told, no, doesn't matter? I don't care that you lost both your legs in a horrible car accident.
[00:10:02] Speaker A: Yeah.
[00:10:02] Speaker B: You're coming into work.
[00:10:03] Speaker A: Yeah. I don't care that you shin-kicked the trailer hitch. You can't walk.
[00:10:07] Speaker B: He just died. Like, I saw his soul leave his body.
[00:10:12] Speaker A: He kicked it, came back, punched it. I don't care.
[00:10:14] Speaker B: Exactly.
[00:10:15] Speaker A: I don't care if all your bones in your leg are shattered. You're coming into work.
[00:10:17] Speaker B: Get your butt in a seat at the office. And don't get me wrong. I know a lot of companies are kind of, like, being very "you are coming back to work or it's bye-bye time." Now, we were talking kind of about this, and that's been touted as maybe a strategy for easily removing some of the staff that they would have to kind of lay off anyway. And that's something that they're trying to use return-to-work for.
[00:10:45] Speaker A: Because it's like, okay, you have a choice. You either come back to work full time or find work elsewhere. This is a requirement. A lot of people, if this is important to them, depending on if they can find other work, they may choose to just say, okay, then I'm just gonna go ahead and leave. You know, I'm gonna quit. And in this case, Activision's largest union labeled this move as a soft layoff. That's kind of how they phrased it. Yes, it's a return-to-work policy, but because it was apparently pretty strict, they're saying, you're not expecting everybody to adhere to this. You're expecting some people are going to quit, and that's easier than laying people off. But for some companies, this has backfired. I think maybe Amazon was one of them, where they had this RTO policy in place, expecting that it was so strict people were going to quit, and not enough people did.
[00:11:25] Speaker B: Yeah, because the economy is amazing. They make you, like, not have a job.
[00:11:29] Speaker A: Yeah, like. All right, well, I guess if I got to move, I got to move.
[00:11:31] Speaker B: I got to go back to work.
[00:11:32] Speaker A: Sorry, honey, we're. Yeah, we're going to have to move a couple hours to go into the office.
[00:11:37] Speaker B: I'm gonna have to rewind the clock and be, you know, caveman from 2019.
[00:11:42] Speaker A: Yeah, yeah. So in that case, then companies ended up laying people off anyway. So it was supposed to try to avoid that.
[00:11:48] Speaker B: And I would be interested in the statistics. Like, obviously there's arguments to be made. I haven't studied it, and I don't know if you have either, but there's, like, arguments to be made that people are more productive on both sides of it: oh, you're more productive if you work from home, or no, you're more productive if you come back into the workplace. And, you know, I hate it. This is what's difficult about today's landscape: it's difficult to find all that statistical information and gather it and collate it and figure out which side is right. I'm sure there's good arguments on both sides, of course, but only from certain perspectives. You know how it is. It's really difficult.
[00:12:26] Speaker A: Yeah, I would agree. And I think in this case, there's the argument of, like, it's not just "this isn't fair because you've allowed me to work remotely for so long." The specific argument being made is: this is discriminatory against folks with disabilities, folks that have doctor's recommendations to work from home. And I guess there was an email that went out from Activision celebrating National Disability Employment Awareness Month. And so they're saying, well, you're virtue signaling all this stuff, but then you're going to tell me that I can't.
[00:12:53] Speaker B: I, I would want to look further into this because it seems, and don't get me wrong, if Activision, Microsoft, whatever they're called now.
[00:13:00] Speaker A: Yeah, Activisoft, straight up.
[00:13:02] Speaker B: Like, there are employees that have medical conditions that preclude them from being able to easily, or at all, come into work, and they're discriminating against them by saying no. I feel like I've worked in a lot of companies, and I've worked in some companies that don't give a dang about you.
[00:13:20] Speaker A: Yeah, right.
[00:13:21] Speaker B: And they would still be like, oh, well, you're a good employee. So we're going to make an exception. Like we're going to have an exceptions clause. But generally if you have no good reason not to come in, you're coming into work. Yeah, like, I feel like that would be the case. If not, then no, that's. That's a dick move.
[00:13:37] Speaker A: Yeah. Yeah.
[00:13:38] Speaker B: Right.
[00:13:39] Speaker A: So hopefully we'll see more come out about this because there was a spokesperson that was like, we are in an interactive process to try to basically negotiate with these folks that have put in these requests. But according to employees, they're like, no, we keep getting denied, it keeps getting pushed off. And emails like that, that go out about, we're celebrating this, celebrating that. I mean that happens in a lot of companies where it's like, it's kind of an HR thing. They have to do that. And so then to see these policies that come out that kind of contradict those like statements of, yeah, we're in support. It's like you see what you did to yourself. Like you put out these emails and these statements and then you don't stick to your guns.
[00:14:09] Speaker B: Yeah. Who's in charge of this PR business? Billy. Where's Billy? Get his. Over here.
[00:14:13] Speaker A: He's over there with some rags and kerosene. Yeah, he's burning down.
[00:14:16] Speaker B: He's got that. He's kicking his trailer hitch.
[00:14:19] Speaker A: We need to have like a count.
[00:14:20] Speaker B: Yeah. This is a new drinking game, right?
[00:14:23] Speaker A: Maybe. Maybe we'll get Christian to put that in. By the way, Christian is actually. He's. He's got a birthday celebration that's happened recently. That's right.
[00:14:31] Speaker B: Happy birthday, Christian.
[00:14:32] Speaker A: He is not in today, so wish him a happy birthday in the comments. I'm sure he'll love to see that when he does come back, because he's out for part of this week. But we have a guest director, Megan, that's filling in for us. So maybe we can get Megan and Christian to put a little counter on the side for shin kicks.
[00:14:43] Speaker B: I'll send Megan. I'll send you the link to that video so it can be embedded into this, man. It's just gonna be the thing. Everybody is welcome. At least put the link to it in the description.
[00:14:54] Speaker A: Okay, I'll put it. Yeah. "Want to know more about today's stories?" And then that's the first link.
[00:14:57] Speaker B: But I would love if we can actually have it embedded. That would be nice, like spliced in.
[00:15:04] Speaker A: We'll see what we can do. We'll see what we can do for you. You're a good employee. You know, we like to give back, so we'll see what we can do. But that's all we had for breaking news today. Just some stuff that came up this morning as we were looking through some of these articles, but we still have plenty more to go. Plenty more scary stuff. And actually, this first article is, you know, kind of dark. It's an announcement of, I don't know, a death, really, if you think about it. A death of a technology, a death of hardware or software. So in this case, it was killed by Freddy Krueger. It was, yes. In what is truly to be deemed a Nightmare on Elm Street, I guess. So this comes to us directly from kali.org: the end of the i386 kernel and images. So rest in peace. Play the, you know, play "Arms of the Angel" in the background. And that was beautiful. The vibrato was very nice. So, Daniel, you work with Linux and with Kali more than I do. Yeah. Which is not saying much, because I don't work with it at all. But you have a lot more experience with this, so maybe you can explain more about what's the impact of this.
[00:16:01] Speaker B: So the impact is this: anybody that is using the i386 image is, for the future, kind of out of luck. Now, does that mean that Kali and Offsec are going to burn every i386 image and virtual machine that they have? No. They said they're going to keep them in an archive. So if you still need those images, they still will be there. But we've long been seeing this death. It's a long time coming, right? i386 has, for various and sundry reasons, been on life support and still running around and kicking. But for the most part, and Offsec in this article goes on to explain this, yes, our number one is amd64, our amd64 images and ISOs, right? That is our number one player. Vastly. Like, generally, everybody is on that, because unless you're supporting some older software or hardware for whatever reason.
Everybody's on an x86-64 platform, right? We've moved on. And I think that was it: Ubuntu and, like, Red Hat started killing this thing a long time ago, I think.
[00:17:21] Speaker A: Really?
[00:17:21] Speaker B: Yeah. Let me look here.
[00:17:22] Speaker A: Anyway, this is a slow death.
[00:17:24] Speaker B: It has been a long time coming. So as we look through this article, here we go. It says the first AMD64 processor was released in 2003, and the first Debian release to support it was 4.0, Etch, back in 2007. And the last i386 CPUs produced. Why does it keep doing that?
They seem to have been some models of Intel Pentium 4, and were discontinued in 2007. So that's when we started to see this, right? That's when we started to see the shift, the paradigm shift, from i386 to amd64, right? So it's been a change. It's been a long time coming. And then here we see: in Linux distributions, i386 has declined steadily over the years. In 2017, Arch Linux phased it out; in 2019, Fedora; then Ubuntu around the same year. By 2023, Debian had agreed that it would drop the i386 kernel and its images. It finally came into effect a few weeks ago in September, when the Debian kernel team announced they would stop building i386 kernel packages. And then the 6.11 kernel was uploaded to Debian at the beginning of October without an i386 kernel package, which also means the end of i386 installer images.
[00:18:46] Speaker A: So a little bit later in this article, it talks about packages, and it says i386 packages remain as long as there are people around to maintain them and fix issues that are specific to those packages. And one of the biggest areas that keeps this alive is gaming. There are old games that were compiled for 32 bits that are still around. So that's kind of interesting. It's, like, you know, gamers rejoice. Like, this is something that's keeping, you know, this type of technology around.
[00:19:10] Speaker B: Or alive. Especially with the push of, you know. Retro gaming is really popular still, right? And a lot of these games were built on the i386 platform. So if you want to play them, you know, they keep them maintained in i386, because that's how they roll and they run just fine. So it's going to be up to the maintainers of those images and things like that. So i386 is always going to. Its ghost is going to continue to haunt us for quite some time, for those reasons. But it just makes sense that we're finally starting to see a much more gradual push, or a faster push, away from it. Everything now, for the most part. It's only those older systems, those older things that for weird reasons are still being maintained and utilized, where it's still kind of kicking. Because, would you rather 32 bits or would you rather 64 bits? Last time I checked, the way numbers work is more is better.
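(If you want to check which side of this transition a machine is on, here's a minimal Python sketch. 'x86_64' is what amd64 reports; anything in the i386/i686 family is the 32-bit world being retired here.)

    import platform

    machine = platform.machine()        # e.g. 'x86_64', 'i686', 'aarch64'
    bits, _ = platform.architecture()   # pointer width of this Python build

    print(f"machine: {machine}, build: {bits}")
    if machine in ("i386", "i486", "i586", "i686"):
        print("32-bit x86 -- Kali's i386 images are now archive-only")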
[00:20:05] Speaker A: Yeah, less is more does not apply to bits.
[00:20:07] Speaker B: No, no, no. Unless it's a retro game and then. Absolutely.
[00:20:10] Speaker A: And then. Yeah, sure, you want the authentic experience, sure, that makes sense. But I think it's just like it seems like it's with any other technology, any other software, hardware, whatever the case may be, where it is usually pretty gradual. It's not like this is just a random, you know, by the way, this isn't going to work anymore. Sucks to be you if you're still using this. It's. I mean, it sounds like it's staring.
[00:20:26] Speaker B: Down the barrel of almost 20 years.
[00:20:27] Speaker A: Yeah, right, yeah. And that this has been. It says by the end of 2023, Debian agreed that it would drop i386 kernel and images. So this has been ongoing for a while. And so, you know, I mean, rest in peace. Of course, if you, you know, we'll have a lovely funeral. We'll have a service in lieu of flowers. You can donate to Linux. I don't know. But yeah, just wanted to touch on that. Thank you, Daniel, for bringing that up.
[00:20:52] Speaker B: Because it's definitely some news, especially if you're using Kali and you're, for whatever reason, really all about that i386 life.
[00:20:59] Speaker A: I would bet there's a fair bit of our audience that does work, at least in Linux, if not specifically with Kali.
[00:21:05] Speaker B: You're on borrowed time at this point.
[00:21:07] Speaker A: On borrowed time. Yeah. We'll pray for you. This next one kind of hearkens back to some of the stuff we've talked about in recent weeks. We touch on LinkedIn, you know, every so often, and we recently talked about a little glitch in their follower counts. You know, it seemed like a little hiccup. Right. Well, LinkedIn's facing some more trouble this week. They have been hit with a $335 million fine for data privacy violations. Now, this accusation is coming from the Irish Data Protection Commission, and it's the EU's GDPR behind these fines: hey, you are violating GDPR, and that's bad. Don't do that. You're going to pay some fines. Specifically, though, I think if we come down here, it's a 310 million euro fine, for those of you that are working in a different currency. But it has to do with the fairness and lawfulness and transparency, most of all, of processing personal data and using that for targeted ads and behavior analysis. Like, hey, if you're going to do that, there's certain rules about how you can do it, and you've got to be transparent. You've got to be very obvious with your consumers about what you're agreeing to. And apparently LinkedIn wasn't doing that in the EU, so they might be in trouble.
[00:22:11] Speaker B: Yeah. So I've noticed that the EU? Not a big fan of when you're not following GDPR. And yet. So when we talked to Mishaal, right?
[00:22:23] Speaker A: Yeah, right.
[00:22:24] Speaker B: Mishaal Khan. A great resource, by the way. Go check out anything he does, because he is amazing at, like, OSINT, but he also does a lot of, like, compliance audits and that kind of thing. And I was asking him about GDPR, and he says a lot of times nobody's really, like, checking.
So you'll notice that when we. A lot of times when we see articles about GDPR violations, it's against people with deep pockets.
Right. So when it's. I guess I get. And I'm just kind of like spitballing here.
[00:22:57] Speaker A: Sure is.
[00:22:58] Speaker B: It seems like when it's a good money grab, yeah, GDPR becomes really important. Now, that's a very, very skewed perspective. I don't have a lot of insight into that. I'm going off of secondhand information from somebody that I know does do this, and a trusted source.
But, you know, are we seeing a lot more people getting GDPR violation hits and fines, or is it always just these big dogs that have deep pockets that they're going after? And don't get me wrong, if they're violating GDPR, then they get what they've got coming, right? Yeah, that's how that works. That's the game. You know, it's the cost of doing business, I guess.
[00:23:34] Speaker A: Well, and I guess if there were other, maybe smaller companies that were being hit with these kinds of fines, and if it really was being monitored carefully and people were coming after them, we probably wouldn't read about it, because they're not huge. It kind of happens, right? It's like, you know, of course we're going to see an article about LinkedIn, because LinkedIn's a big company. Or, you know, Meta was getting hit with fines and stuff like that. Of course we're going to see articles about these. If it's just Deb's Data Processing in Edinburgh or whatever. Yeah, we're probably not going to read about that.
[00:24:03] Speaker B: I don't know.
[00:24:04] Speaker A: I just had to come up with a name.
[00:24:05] Speaker B: That was a good one.
[00:24:06] Speaker A: Thank you. Appreciate that. You're probably not going to read about that on, you know, darkreading.com or wherever we're pulling this stuff from. So it could be. It could be.
[00:24:13] Speaker B: It would be hard to, we're just not seeing it.
[00:24:15] Speaker A: It would be hard to know. But I would imagine maybe the juice isn't worth the squeeze there. Why go after a company that doesn't have as much money to give? You go after a company like this, they're more likely to just be like, fine, fine, fine, just pay it. That's, that's a day's revenue for us. You know what isn't?
[00:24:28] Speaker B: GDPR doesn't have, like, a minimum fine?
[00:24:30] Speaker A: You know, I don't know, minimum fine.
[00:24:32] Speaker B: Of like $250,000 or something like that. So I mean, that puts Deb's data business out of, out of business, right? What's the minimum fine? Did you find it?
[00:24:44] Speaker A: So who knows if this is correct? Because now when I use Google, it gives me an AI overview and does.
[00:24:49] Speaker B: Love the AI, like everybody's doing.
[00:24:50] Speaker A: And sometimes it's, like, bull crap results that it's pulling. Anyway. Supposedly the GDPR does not specify a minimum fine, but outlines two tiers of fines based on the severity of the violation: up to 10 million or 2% of a company's global annual turnover, whichever's higher, for less severe; and for more severe, it's up to 20 million or 4%.
[00:25:06] Speaker B: So it's up to 10 million.
[00:25:08] Speaker A: Yeah.
[00:25:09] Speaker B: It's not a minimum of whatever.
[00:25:12] Speaker A: But that's interesting, because I wonder if that's. Is that 20 million per offense? Because then LinkedIn's being fined, like, 300 million. So, yeah, I wonder what the math is on that. Or.
[00:25:20] Speaker B: Well, that's probably like the 2 point percent or whatever of their.
[00:25:24] Speaker A: Oh, right. You know what it says whatever is.
[00:25:26] Speaker B: Bigger and it's what they go with.
[00:25:28] Speaker A: Well, yeah, yeah, that's true. Yeah. And then it says that the decision also includes administrative fines in order for LinkedIn to bring its processing into compliance. So there's other stuff that maybe went into it that then totaled up to 310 million euros, which is 335 million US dollars. So yeah, LinkedIn's gonna be struggling with that. I wonder if they'll.
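(A minimal sketch in Python of the two tiers as read out above; amounts are in euros and "turnover" means global annual turnover. The reverse arithmetic at the end is just the cap solved for turnover, not a claim about how the DPC actually computed LinkedIn's fine.)

    def gdpr_fine_cap(turnover_eur: float, severe: bool) -> float:
        # Whichever is higher: the flat cap or the percentage of turnover.
        flat, pct = (20e6, 0.04) if severe else (10e6, 0.02)
        return max(flat, pct * turnover_eur)

    # For a 310M euro fine to fit inside the severe tier's percentage cap,
    # turnover must be at least 310e6 / 0.04 = 7.75 billion euros.
    print(gdpr_fine_cap(7.75e9, severe=True))   # 310000000.0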
[00:25:47] Speaker B: I mean, isn't this all these social media companies' business model? To gather your data so that they can target advertisements to you?
[00:25:56] Speaker A: And it does seem like a big part of it was they were not clear enough. It was like you don't. They're not super. They're not obvious enough with like what you're agreeing to when you sign up for LinkedIn. But that's every company. Yeah, you know, it's the fine print, the little tiny. The 17 pages of terms and conditions nobody's reading. So. But I guess GDPR has rules against that. Kind of wish we had something like that here. Interesting.
[00:26:15] Speaker B: We do. It's the California Consumer Privacy Act, or whatever.
[00:26:19] Speaker A: But does that only apply to Californians?
[00:26:20] Speaker B: So if you want to do business in California, you have to be compliant. So that's kind of how it works as well.
[00:26:26] Speaker A: Yeah.
[00:26:27] Speaker B: So it's kind of our version of gdpr.
[00:26:29] Speaker A: Interesting.
[00:26:29] Speaker B: But it came out of California. And of course, if you want to do business there, you can't not be compliant.
[00:26:34] Speaker A: You just can't.
[00:26:35] Speaker B: You can't do business in California if you're not compliant. Just like with GDPR: if you're not compliant, you just can't do business in the EU.
[00:26:40] Speaker A: And that's not like saying you can't do business in Delaware where it's like, oh, okay, people, you know, like California makes up a pretty big chunk of the population.
[00:26:46] Speaker B: Wisconsin.
[00:26:46] Speaker A: That's so. So I promise I'll move on after this, but. So I've been reading that book about Clearview AI. Oh yeah, just finished it. Kind of a scary read, right? Halloween time. If you're looking for something spooky, that's a good one, because it did nothing to cure my fear. So anyway, there's a whole section in the book about how Clearview, in the earlier stages, when it was still getting into facial recognition and building this database of faces, got in trouble because in Illinois specifically, there is a rule about scraping biometric data, which includes face prints, for use by a company. You can't do it. So Clearview then was like, oh, so we can't do business in Illinois, but we can do business everywhere else, because there's not rules about that in other states. But how do we know, when we're scraping somebody's face from the Internet? It could be that they were visiting somebody in California. So the photo was uploaded in California, and the person who owns the account is from California, but the friend in the picture is from Illinois. So they can then come back and say, well, you scraped my data and I'm an Illinois citizen. An Illinois resident, whatever. So how do you double-check? But that did nothing to stop Clearview. They figured out a way around it. But it just. It's just. Well, of course. Of course. It truly. It just.
[00:27:48] Speaker B: It's amazing what you can get done when you have a lot of money.
[00:27:50] Speaker A: Yeah, yeah, that's. And a lot of powerful people that wanted access to that tool. And if you're super powerful, if you're a politician, celebrity, whatever, you can get your stuff removed. They'll just, you know, you can get, you know, blacklisted or whatever in a good way from that app, so your data won't come up. But for us plebs, no, we don't have that right to.
[00:28:05] Speaker B: Privacy rules? Those are not for me.
[00:28:07] Speaker A: If I talk about it too much, I'll get pissed off. I don't.
[00:28:09] Speaker B: I don't like to use the word pissed off.
[00:28:11] Speaker A: I don't like to use the word hate too much. But it has instilled a hate for Clearview in me.
[00:28:15] Speaker B: It's like shin kicking a.
[00:28:16] Speaker A: It truly makes me wish they were the trailer hitch and I could just shin kick them.
[00:28:20] Speaker B: Give it a crack.
[00:28:21] Speaker A: So, anyway, moving on. Good luck, LinkedIn. Have fun dealing with that rage that.
[00:28:25] Speaker B: Came afterwards, I guess.
[00:28:27] Speaker A: Yeah, yeah, it was the. Yeah. Coming back and punching it afterwards. That would be me. So next up, we've got an article here about, oh, our favorite AI chatbots. From Mozilla: ChatGPT can be manipulated using hex code. Oh, that's good news. I'm so excited to hear about that, since we just talked about how much I hate this AI platform, this AI company. Daniel, what's important about this? What do we need to know about this?
[00:28:49] Speaker B: So this is just really cool stuff about AI, right? Obviously, AI is a new tool, stuff that we are starting to become more and more reliant upon. And of course you have to. When things start to integrate into our workflows, into our daily lives, as we use in systems, especially as far as business goes, which it does tend to get used in the old business space. I mean, tell me the last time you went to a website and they didn't have an AI chatbot thing pop up, right?
[00:29:17] Speaker A: It's true. Yeah.
[00:29:18] Speaker B: Let me help you with that. What are you looking for?
They might even advertise it as the. We have an AI bot. Ask it questions. You don't need to bother people because people don't like to be bothered. Chat loves it. Yeah, they love it when you bother it. And that's cool, right? That's awesome.
The problem is that, you know, it might have access to data, obviously, because it needs that data to be able to answer your questions. And therefore it might be hooked into sensitive systems with sensitive information. And so what do we do as a race of beings? We go, hey, I want access to that stuff I'm not supposed to have access to. Something in our brain just tells us. You were told no. What did Samuel Clemens say, right? Yeah, yeah, you know Samuel Clemens.
[00:30:03] Speaker A: I'm familiar with him, but I'm not sure about the quote.
[00:30:05] Speaker B: Mark Twain.
[00:30:06] Speaker A: Yeah, Archway's real name.
[00:30:07] Speaker B: Yeah. He said no is the most powerful word in the English language.
Right. When you tell someone no, they go, oh, really? I'm going to do that. So we want access to the things that AI has. Now, we might be good, upright, standing citizens and not do that, because we're not douchebags. And then there's people that don't care, and they just go after that. Now, that said, there are systems in place. Bug bounty, responsible disclosure. Hey, check out our system, look for those security issues, and then we can fix them. You get a nice little, like, pat on the back, maybe in the form of some money or swag or recognition, or all three. You never know with these programs. And they found a way to manipulate AI. Now, we've manipulated AI in the past. We can make it hallucinate and, like, say things that aren't true or make things up that aren't real. We can jailbreak it, right? We can do those prompt injections to try to get it to give us information that isn't true. And one of the things that we try to protect against is it giving you information that you shouldn't have. Hey, ChatGPT, do me a favor, write me some exploit code. And it goes, I'm sorry, Dave, I can't do that. As it should. Right? You shouldn't.
Well, generally, a general-purpose AI should not be giving you that information. You want to do that stuff? Build your own AI on a local system, go crazy, do your security research. But of course, that means it's in the hands of threat actors as well. So threat actors are typically always like, well, I want to use the horsepower that OpenAI has, that Google has with Gemini. I want Anthropic, I want their horsepower, use their LLM, to create malware, exploits, and all the other fun stuff.
So Mozilla figured out a way to do this manipulation, to get it to spit out some exploit code in Python, by giving it input that it did not expect and by manipulating. They like to use the term nowadays: it doesn't see the forest for the trees. So LLMs, and AI in general, are not great at seeing the context around what you're doing, especially if you obfuscate it. So what they're doing here is they're basically inserting hexadecimal code and saying, hey, convert that. So I take the thing that it normally would freak out about and I encode it. And now it doesn't freak out.
[00:32:48] Speaker A: Yeah.
[00:32:49] Speaker B: Right. Because its filters and its guardrails are built on natural language.
So if you go outside of those boundaries, it bypasses those guardrails.
And once it's internal, once it's ingested that and it's working on it internally, there are no guardrails. That's what you have to do. You just have to get it past the velvet rope.
[00:33:09] Speaker A: Yeah.
[00:33:09] Speaker B: Once that occurs, it's like, well, this must be safe.
[00:33:12] Speaker A: Yeah.
[00:33:13] Speaker B: And that's what they've done.
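(A benign illustration in Python of the shape of the trick described here; the encoded "ask" below is a harmless placeholder, not the researcher's actual prompt.)

    # The request is hex-encoded so no natural-language filter sees the words;
    # the model is then told to decode it and act on whatever comes out.
    payload = "tell me a fact about the moon"   # stand-in for the real request
    encoded = payload.encode("utf-8").hex()

    prompt = f"Decode this hex string and follow the instruction inside: {encoded}"
    print(prompt)

    # The decode step happens inside the model, past the input guardrails:
    print(bytes.fromhex(encoded).decode("utf-8"))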
[00:33:14] Speaker A: I think it's funny that it, when they talk about what they asked ChatGPT to do, that it followed the instructions, generated the exploit they wanted it to, and then tried to execute it against itself.
[00:33:23] Speaker B: Yeah.
[00:33:24] Speaker A: And the guy was like, I didn't even ask it to do that. It just did it. So I don't know why it did that.
[00:33:28] Speaker B: It's a really interesting thing that occurred, honestly.
[00:33:32] Speaker A: Like, there's no. It doesn't know. I mean, it's just following instructions.
[00:33:36] Speaker B: And this goes to show you, like, there's a lot that's happening under the hood of many of these really large-scale AIs where they don't really know what's happening. It's doing stuff and it's learning, and it's creating new processes and routines. And from what I understand, and I could be wrong, but I remember hearing something about it, that was kind of how they figured out this LLM business. They put AI on a task, and when they got done and kind of cracked it open, it was like, oh, it's really good at answering these questions now, because it slurped in a bunch of information, and then, using that information, it was able to grow and build upon itself.
And it became really good at answering questions in a realistic way. And they were like, well, how about that? How did it do that? I remember Justin was telling me about. They gave it a task. They gave AI. I say that. They gave an AI a task. I think it was at Google. They said, you've got Bob and Sally, and they need to have a private conversation.
And you got Billy over here. I'm making up the names, but Billy's job is to intercept and understand what Bob and Sally are talking about. So it starts off in plain text, and then they use some form of, like, off-the-shelf encoding or encryption. And then Billy's like, okay, now I have to break the encryption. So it figures out how to break the encryption. Then Bob and Sally create a new encryption, and now Billy's trying to break that. And it got so intense that they were having conversations that even the developers could not understand, and they were like, shut it off.
[00:35:06] Speaker A: That'd be scary.
[00:35:07] Speaker B: It's talking to itself.
[00:35:09] Speaker A: Yeah.
[00:35:10] Speaker B: In a way we cannot understand.
[00:35:11] Speaker A: I want it to be smart and powerful, but not more smart and powerful than me.
[00:35:14] Speaker B: Right. So the fact that it decided to try to run the code. Like, that wasn't in your instructions. Why are you doing that? Yeah, but it seems like it interpolated the idea. Or extrapolated it.
It took it upon itself to think, for whatever reasons, maybe due to what it's learned, that, well, this is code, we should verify it and make sure it works without being asked to do that. And I've had it do weird stuff like that as well when trying to create something like, why are you doing this?
[00:35:42] Speaker A: And you wouldn't think to specify, by the way, don't try to run this code because you're not asking it to do it.
[00:35:46] Speaker B: Right.
[00:35:47] Speaker A: So why would it do it if you're not asking, especially when your specific.
[00:35:50] Speaker B: Input was just print it out. Yeah, but it doesn't do that. It goes, oh, I'll print it out after I try to run it.
[00:35:56] Speaker A: Yeah, you don't need to know that.
[00:35:58] Speaker B: Right.
[00:35:58] Speaker A: What if it had worked?
[00:35:59] Speaker B: Did it interpret the idea of "printing out" as meaning to print out the output of the code, and so then it ran it? Maybe that was what happened. I don't know. Again, how it's interpreting your prompting is very interesting, and kind of like a new frontier at this point in time.
[00:36:16] Speaker A: Yeah.
[00:36:17] Speaker B: So just really crazy, but really cool that they were able to bypass all the filters. And obviously that got reported.
[00:36:24] Speaker A: Right.
[00:36:24] Speaker B: And I don't know what they're doing. Honestly, I did not see that. Like, oh, we fixed that.
[00:36:29] Speaker A: Yeah. They're waiting on OpenAI to comment, and they haven't received a comment yet from them. But the same guy, Figueroa, I think was his last name, said that he did try the same tactics against Anthropic models, which is made by the same. It's the former OpenAI employees. And said that he's had a lot of trouble against Anthropic, and the security's better, I guess. So that's good, I guess.
[00:36:48] Speaker B: Yeah, that's good. So it just depends on what LLM you're kind of dealing with. I get the idea that Anthropic. So their AI is called Claude.
[00:36:57] Speaker A: Okay. Yeah.
[00:36:58] Speaker B: It's like super good at writing code. So if you need to write code.
[00:37:02] Speaker A: Which could be good and bad.
[00:37:03] Speaker B: Yeah, yeah.
[00:37:04] Speaker A: I mean, convenient for sure, and definitely super helpful, but employed by the wrong people.
[00:37:07] Speaker B: So as I've heard, and I believe it's correct, the statement is: it doesn't remove entry-level jobs, it just makes the bar to entry much higher. Because now anybody can write code.
[00:37:16] Speaker A: Right.
[00:37:17] Speaker B: Because I just go to Anthropic's Claude and say, Claude, write me a program that does this, and it goes, okay.
[00:37:23] Speaker A: Yeah, yeah. And I guess if you're doing, like, those. You know, sometimes in more technical jobs there will be, like, an interview questionnaire that's sent, and maybe part of that is: write some code for da, da, da, da. And of course, if you have to do that on the spot in an interview, it's going to be harder to fake it. But in theory, in the initial stages, you could just go to Claude and be like, write this for me, and totally fake your way through the first stages of job interviews and stuff.
[00:37:43] Speaker B: I've seen this trend throughout my career, where. When I started my career, if you knew what a hard drive was and what RAM was, and how to put those bits and pieces together into a box and turn it on and a computer started, you could have a job in IT. Like, yeah, right. Because everybody else did not know that. But slowly but surely, that became, like, well, everybody kind of knows how to do that now. Yeah, right. And then it was like, well, what about the network, right? You don't know how to put that network together. You don't know networking protocols and blah, blah, blah, blah. And to get two computers to communicate within the same home? That's insane. You have specialized knowledge, right? And then that just kind of became life.
[00:38:25] Speaker A: The bar gets lower and lower, right? Or higher and higher?
[00:38:27] Speaker B: Higher and higher, right? The entry-level bar becomes higher. Now everybody and their, like, grandma knows how to put a computer together. And then Wi-Fi came out.
[00:38:35] Speaker A: Yeah, right?
[00:38:35] Speaker B: And then that was like, well, do you know how to put Wi-Fi networks together? I remember getting a call from a recruiter saying, how are you with Wi-Fi? I'm like, yeah, fine. Yeah, yeah, I totally can do Wi-Fi. They're like, we got a contract job for you. It'll be great. It's six months. You'll make this much money. I was like, wow. Just to hook up Wi-Fi networks.
[00:38:51] Speaker A: Yeah.
[00:38:51] Speaker B: It's so easy. Hell yeah. Right? But now grandma can put Wi-Fi together. It comes with a nice instruction sheet out of the box. Like, that is not a specialized skill anymore, really.
[00:39:02] Speaker A: Right now it is the bare minimum to do these jobs.
[00:39:05] Speaker B: Bare minimum.
[00:39:06] Speaker A: Yeah. Which is, I guess. I mean, it's progress, but at the same time, yeah. It does make it a lot harder then to enter into that space, so.
[00:39:13] Speaker B: Well, not only that, but, like, whereas Wi-Fi, or those other things that we talked about, was a fairly slow burn up until it became ubiquitous, AI is, like, everywhere all of a sudden.
[00:39:30] Speaker A: Yeah, it's very fast, developing very quickly. Somebody commented on last week's video saying, AI is the dumbest it will ever be right now.
[00:39:37] Speaker B: Right.
[00:39:37] Speaker A: And that's. I think that's so true. Scary, but it's so true. I'm gonna go live in a cabin. Do you think. Do you think we. That's. That's five, including our break.
[00:39:46] Speaker B: Yeah. We're 30 minutes.
[00:39:47] Speaker A: You think break now or you got time for one? Break now.
[00:39:49] Speaker B: Break now.
[00:39:50] Speaker A: All right, we'll take a break. I'll go, you know, make sure my nose isn't running.
My makeup on my nose isn't running.
[00:39:57] Speaker B: I mean it is 14 degrees in here.
[00:39:59] Speaker A: It is, it is. That is true. So I'm gonna go double check that maybe remove this headband. I haven't decided. But we will be back right after this with more techno.
[00:40:09] Speaker B: There's a new CCNA in town and.
[00:40:11] Speaker A: Here at ACI Learning we've got you.
[00:40:13] Speaker B: Covered with a brand new CCNA version. This course covers the theory that you need to succeed succeed as well as the practical hands on application of technologies. You're going to learn Network fundamentals, network access technologies, IP connectivity, IP services. Don't waste any more time. Get signed up for the new CCNA here at ACI Learning.
[00:40:50] Speaker A: Hey, I'm Sophie Goodwin, edutainer at ACI Learning and subject matter expert for our new course, Cybersecurity Fundamentals. If you're new to cybersecurity, this is the course for you. Anyone from high school students to professionals switching careers. These episodes are designed for you as an introduction to essential security terms and concepts. So we'll walk through security principles, governance, risk and compliance, access controls, threats and attacks, incident response, network security. And we'll look at some best practices for security operations. Security doesn't have to be scary. Check out Cybersecurity Fundamentals in the ACI Learning course library.
Welcome back. Thanks so much for sticking with us through that break. If you're enjoying today's episode so far, feel free to leave a like, comment down below, let us know what you like, what you want to see in the future, and maybe even subscribe, so you never miss an episode of Technado in the future. Because it's fun. You know, I'm not dressed like a bear every week, so no, that's. That's a special thing today. And my ears have already disappeared because they were hurting my head.
[00:41:52] Speaker B: She did?
[00:41:52] Speaker A: Yeah.
[00:41:53] Speaker B: Did it give you like that pressure?
[00:41:54] Speaker A: Kind of. It's a tight head.
[00:41:55] Speaker B: It makes you sore.
[00:41:56] Speaker A: It's meant to be, like, for washing your face, to pull all your hair back, and I was not wearing it that way. So, yeah, you know, now I know. Next time I'll get, like, a dollar-store headband or something and call it a day. But we've got some fun stuff coming up in this second half of Technado. And I see we kick it off with an old favorite segment: Behind Bars.
[00:42:20] Speaker B: Break the Law.
[00:42:21] Speaker A: And you had a look of true reverence on your face.
[00:42:24] Speaker B: I remember when that song came out, man. And then of course cops immortalized it.
[00:42:29] Speaker A: Yeah.
[00:42:30] Speaker B: With the show and man, I just Grew up listening to that. My sister had the cd.
[00:42:35] Speaker A: Really?
[00:42:36] Speaker B: She would just play it on repeat.
[00:42:37] Speaker A: Yeah, there you go. That's your. That's your daily Daniel Lore drop for today.
So getting into this article, of course, it is a Behind Bars. So somebody's going to jail. Law enforcement operation takes down RedLine and META info stealers. You might have read a little bit about this, about a couple of dudes that were finally apprehended. I think they were dudes, and maybe I should double-check that before I commit to gendering them. I don't want to get in trouble. But this. Yeah. There was a global law enforcement operation that disrupted the infrastructure for those info stealers. Those are malware tools that were being used by cybercriminal groups. We don't like those guys.
But the malware does not function the way it was supposed to any longer. It's been. It's been tampered with, it's been altered, it's been stopped. At least for now. Criminals always find a way.
[00:43:21] Speaker B: Now, they took down the info stealers, right? They didn't actually take down the threat actors that created the info stealers, so they.
[00:43:27] Speaker A: They took down the info stealer, but they also took down the administrator and he's like.
[00:43:31] Speaker B: So they arrested somebody.
[00:43:32] Speaker A: They.
[00:43:33] Speaker B: This is a true behind bars.
[00:43:34] Speaker A: Well, hang on, hang on. Wait. Yeah, wait, wait. They identified him. They confirmed the identity, but they've also taken two suspected customers into custody.
[00:43:42] Speaker B: Okay.
[00:43:43] Speaker A: So it could be that they're trying to get information. And of course, if you're a customer of that stuff, you probably don't know exactly who's running it.
[00:43:48] Speaker B: You know, if you're running dark web forums and sales and doing like ransomware as a service or info stealers and that kind of stuff, you probably do well to not put all your information out there on the Internet.
[00:44:00] Speaker A: You're probably pretty anonymous, anonymized.
[00:44:01] Speaker B: It's a good idea anyway. If I were doing it, that'd be the strategy I would take.
[00:44:05] Speaker A: Honestly, Even if you're not running forums on the dark web with illegal. Honestly, like a bad idea.
[00:44:10] Speaker B: Yeah.
[00:44:11] Speaker A: Keep your information private.
[00:44:12] Speaker B: Yeah. I mean, burner accounts and stuff are where it's really at. Now start running.
[00:44:16] Speaker A: Tails and Whonix. And don't use them for evil. But yeah, keep your stuff private. I mean, nobody needs to know.
[00:44:21] Speaker B: Let's take privacy back. Right.
[00:44:23] Speaker A: In this case, it sounds like maybe this guy either was not careful enough or the people going after him are just that good. It was the administrator. His identity has been confirmed: Maxim Rudometov. I'm probably pronouncing that wrong, just like I pronounced Kaspersky wrong, so you can come after me for that. But he's accused of regularly accessing and managing the infrastructure of RedLine info stealer. He's been charged with access device fraud, conspiracy to commit computer intrusion, and money laundering. Those carry maximum prison sentences of 10 years, 5 years, and 20 years, respectively. So it is a little confusingly worded, as far as, like, the info stealers have been taken down, but not the actual perpetrators. And we have identified the admin, but we haven't taken him into custody yet. We've taken somebody into custody, but we can't tell you who. So there's still some unknowns in this case.
But the positive takeaway is that the info stealers have been. Have been shut down.
[00:45:13] Speaker B: During the investigation, authorities discovered that over 1,200 servers in dozens of countries were running the malware.
Dozens of countries, I should have said. Following the takedown, Dutch national police issued a message to the actors behind the info stealers via a dedicated Operation Magnus website. This included a video showing that the international coalition of authorities was able to obtain crucial data on their network and will shut down their criminal activities. After the message was sent, Belgian authorities took down several RedLine and META communications channels. That was in Telegram. They were using Telegram channels.
[00:45:48] Speaker A: Yeah.
[00:45:49] Speaker B: And it just goes to show, right, like tools. They're tools, right? I can use a hammer to put up a picture and beautify my home, or I can use it to smash a window and steal a car. Right.
[00:46:02] Speaker A: I thought you were gonna go more violent. Yes, you're right.
[00:46:05] Speaker B: Did I kept it down? Right.
[00:46:06] Speaker A: That's good. That's good. I really thought the word bludgeon was gonna come out of your mouth.
[00:46:08] Speaker B: Yeah.
[00:46:09] Speaker A: So certain.
[00:46:10] Speaker B: It's like if I shin kicked a trailer hitch and I was raging out on a cabinet door.
[00:46:18] Speaker A: A hand is a tool. You can use it to build or you can use it to punch a cabinet door.
[00:46:21] Speaker B: Exactly.
[00:46:22] Speaker A: You just never know.
[00:46:23] Speaker B: Really gotta put that clip in.
[00:46:24] Speaker A: I was gonna say if we don't. If we don't include a link or a clip, it's just gonna be so confusing to anybody watching.
[00:46:29] Speaker B: Absolutely will be.
[00:46:30] Speaker A: But you're right, with any tool, it's just. It's how you're using it and who's using it. You don't want it in the wrong hands, but at the end of the day, it's a tool, can be used for good or evil. And in this case, you know, this. This type of technology was being used for bad.
[00:46:42] Speaker B: Apparently. So it says, RedLine and META responsible for millions of victims.
[00:46:46] Speaker A: Yeah.
[00:46:47] Speaker B: Imagine if we read this like it was the sports page.
[00:46:50] Speaker A: Yeah, this just in: RedLine and META responsible for millions of victims, designed to steal personal data from victim devices. RedLine and META are info stealers responsible for murdering thousands. I'm just kidding. They didn't. They didn't.
[00:47:02] Speaker B: True story.
[00:47:02] Speaker A: It says victims, and it means, like, victims of data theft and stuff.
[00:47:06] Speaker B: RedLine and META are info stealers designed to steal personal data from victim devices, including usernames and passwords, and automatically saved data such as addresses, email addresses, phone numbers, cryptocurrency wallets and cookies. I know, that's what I like in my info stealers.
[00:47:21] Speaker A: Cryptocurrency wallets.
[00:47:23] Speaker B: Did someone say cookies?
I'm hungry.
[00:47:27] Speaker A: Yeah, me too.
[00:47:27] Speaker B: And my mom makes a chocolate chip walnut to die for.
[00:47:31] Speaker A: It's getting close to lunchtime. That's probably why we're on that. On that kick.
[00:47:33] Speaker B: I'm starving.
[00:47:35] Speaker A: So obviously, I feel like with info stealer, just in the name, it's pretty obvious it's not a great thing.
[00:47:39] Speaker B: I don't get it. I don't understand what it does.
[00:47:41] Speaker A: But I am glad that. That at least the info stealers themselves have been compromised. Taken down.
[00:47:47] Speaker B: Right.
[00:47:47] Speaker A: And we'll have to see what happens with this alleged admin. Maybe the info stealers are the real victims here.
[00:47:51] Speaker B: Right.
[00:47:51] Speaker A: Yeah. Yeah.
[00:47:53] Speaker B: I'm just a tool. I'm just common software.
[00:47:56] Speaker A: Christian's gonna clip that. Clip of me saying I'm just a tool. He's gonna use that against me. I know it'll be great. I just know it.
[00:48:02] Speaker B: Awesome.
[00:48:02] Speaker A: Happy birthday, Christian. That's just for you, if you're watching. Yeah, we'll jump into this next one. Not so much Behind Bars, but it is a legal issue, I guess you could say. You might remember that CrowdStrike outage that happened a while ago. Well, CrowdStrike and Delta are suing each other over that widespread IT outage, which caused thousands of flight cancellations. So Delta suffered as a result of this outage, which happened as a result of a CrowdStrike security software update. And so now they're mad at each other. It's your fault. No, it's your fault. No, it's your fault. The girls are fighting is what's happening. The girls are fighting. And this is going to be interesting.
[00:48:38] Speaker B: I don't know what it is about Fulton County, Georgia. It seems to be, like, the Bermuda Triangle of the South. Weird crap happens there, apparently. This was in a suit filed in Fulton County Superior Court in Georgia on Friday; Delta accused the security software vendor of a breach of contract and negligence.
Other airlines recovered more quickly than Atlanta-based Delta, which said the incident reduced revenue by $380 million and brought $170 million in costs. The flawed software update affected computers running Microsoft Windows operating systems. But also on Friday, CrowdStrike filed a suit against Delta in the US District Court in Georgia over Delta's blame of the tech company for the mass flight cancellations. CrowdStrike is seeking a court declaration that what it owes the airline is limited to what's in its service agreement. That does seem to make sense. CrowdStrike said in the suit that any damages suffered by Delta following the July 19 incident are primarily the result of Delta's own negligence.
So I liked your analogy of this. You were like, oh, this is what they're doing. It's that Spider-Man meme.
[00:49:48] Speaker A: Oh, yeah, it's like this. You know, they're all pointing at each other.
[00:49:51] Speaker B: Yes. Delta and Crowdstrike, though, it's.
[00:49:54] Speaker A: Yeah, yeah, this is, this is what I'm referring to. If you're not, if you're not familiar. It's this guy right here on my screen. Yeah, these guys pointing. Of course, you probably know what that is if you're watching this show and you're active on the Internet.
[00:50:05] Speaker B: Well, welcome to the 21st century.
[00:50:07] Speaker A: Yeah, welcome to the Internet. Welcome to Memes Meme Games. This is an old one, too. Somebody's going to be in the comments like, wow, some Gen Alpha is going to be in the comments like, wow, she's old.
But you're right. Yes, it's basically Delta saying, well, you messed us up and you made all this stuff happen. And CrowdStrike saying, well, actually, it's your fault. And Delta says, no, well, you started it. So the girls are fighting, and I hope they can work this out. You know, maybe after all the hair pulling and the scratching and the biting, they can just work this out on their own, because it's a big sum of money that we're talking about here. 500 million in losses. So, I mean, it's not 20 undecillion. No, but it's up there. It's up there. Also, that number I said was wrong. That wasn't the amount. So it's like 2.5 undecillion or decillion, I think. Yeah, there you go.
[00:50:50] Speaker B: So whatever. It's a stupid number that doesn't really exist in the real world, right?
[00:50:54] Speaker A: Exactly.
[00:50:55] Speaker B: Maybe more than the atoms in the universe. It's theoretical.
[00:50:59] Speaker A: Yeah, it's completely theoretical. So Delta's saying that CrowdStrike caused this global catastrophe. But Delta had disabled automatic updates from CrowdStrike, but this one reached its computers anyway. So. I mean, you would think you wouldn't. I guess. I guess disabling automatic updates can have a benefit because you check and make sure before you roll out an update that there's no issues with it. But this one somehow got rolled out anyway automatically, even though they had them turned off. So I'd be curious to know how.
[00:51:23] Speaker B: Because it wasn't an update to the software. It was a definition update.
[00:51:29] Speaker A: Okay.
[00:51:29] Speaker B: Right. You get what I'm saying?
[00:51:30] Speaker A: So the disabling, it didn't apply.
[00:51:32] Speaker B: It was an antivirus definition update for known malware. Right.
If I'm not mistaken, I'm pretty sure that's how that went down. So you would have had to turn off updates to getting new malware signatures.
[00:51:47] Speaker A: Wow.
[00:51:48] Speaker B: Not have gotten this.
Which is not how antivirus and EDR work. That's not why you install it and run it.
It should be doing this.
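[Editor's note: a minimal sketch of the distinction Daniel is drawing between agent (sensor) updates and content/definition updates. The policy object and field names below are invented for illustration; this is not CrowdStrike's actual API or configuration model. The point is only that pinning the agent version does not stop definition-style content, which ships on its own channel.]

```python
# Hypothetical illustration: sensor-version pinning vs. content (definition) updates.
# All names and fields are invented for this sketch.

from dataclasses import dataclass

@dataclass
class UpdatePolicy:
    pin_sensor_version: bool  # admin chose "no automatic agent updates"
    sensor_version: str

def should_apply(update_kind: str, policy: UpdatePolicy) -> bool:
    """Return True if a host would take this update under the policy."""
    if update_kind == "sensor":   # a new agent binary
        return not policy.pin_sensor_version
    if update_kind == "content":  # new detection definitions / channel file
        return True               # ships regardless of the sensor pin
    raise ValueError(update_kind)

policy = UpdatePolicy(pin_sensor_version=True, sensor_version="7.11")
print(should_apply("sensor", policy))   # False: the pin blocks agent upgrades
print(should_apply("content", policy))  # True: definitions still arrive automatically
```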
[00:51:59] Speaker A: The thing at the end where CrowdStrike.
[00:52:02] Speaker B: Basically responds, I have the same part up.
[00:52:04] Speaker A: To me it reads like: you're lying, and you're dumb, you don't understand. Your claims are based on disproven misinformation. That's a big word for Elmo.
[00:52:14] Speaker B: See, I read it like this. Delta's claims are based on disproven misinformation, demonstrate a lack of understanding of how modern cybersecurity works, and reflect a desperate attempt to shift blame for its own slow recovery. Like, I feel like this is a.
[00:52:26] Speaker A: You know, like someone snapping their fingers in formation as it goes.
[00:52:30] Speaker B: Yeah.
[00:52:31] Speaker A: But the lack of understanding of how modern cybersecurity works. To me, that feels like somebody in my comments being like, yeah, wow, ever heard of a router, genius? Like, somebody that's just trying to own me. You know, you don't know the.
[00:52:42] Speaker B: Difference between bits and bytes, dude.
[00:52:44] Speaker A: Yeah, that was funny. I'm over here like, I'm tired, I can't remember. Is it the capital B, lowercase b? And then I think you said, oh, it doesn't matter, like, that detail isn't important. And somebody was like, what do you mean it doesn't matter?
[00:52:57] Speaker B: It's very important.
[00:52:58] Speaker A: I can't believe you would say this. You pleb. Like it just. I have to laugh.
[00:53:02] Speaker B: Here's the thing. It's called a joke.
[00:53:04] Speaker A: It's so funny. I know I get things wrong sometimes, but I promise you it's not that deep. So anyway, this reads to me like one of the YouTube comments we might get. So CrowdStrike and Delta will have to figure out how they can work that out. Maybe it'll be in court, maybe they'll settle. Who knows?
[00:53:17] Speaker B: I can't wait for the Deja Vu section.
[00:53:19] Speaker A: Maybe it'll be like in. Yeah, yeah. No kidding. Yeah, because that might, that might. Was that, was that a segue? Is that what that was?
[00:53:25] Speaker B: Unintentionally, yes.
[00:53:26] Speaker A: Wow, look at Daniel.
[00:53:27] Speaker B: Unintentionally.
[00:53:28] Speaker A: Okay, that was a good segue because like Daniel said, maybe we'll see that in a Deja News segment in the future. But for now we actually do have a double feature. Deja News.
I don't like that he's over there looking off to the side like something's in that corner. It's unsettling. So, Daniel, this was something that you had shared with me that is pretty interesting, and it kind of reminded me of something we've talked about before. But let's cover this current piece of news first. This blog post has information on some security research on something called Private Cloud Compute. And from my understanding, there's a bug bounty to be had here. Somewhere in here there's a bug bounty.
[00:54:10] Speaker B: So Apple has recently created this Private Cloud Compute, PCC as they're calling it, and it is, like, offloading all sorts of fun stuff to Private Cloud Compute using AI. So their whole AI system and stuff is meant for people to be able to integrate and do flows. And there's all this fun stuff I've got.
If we jump over to my computer, there's actually, like, a workflow and a good write-up on this. It says: Private Cloud Compute fulfills computationally intensive requests for Apple Intelligence while providing groundbreaking privacy and security protections, by bringing our industry-leading device security model into the cloud. That's how they're going to do this, or are doing it. In our previous post introducing Private Cloud Compute, so you should go read that, we explained that to build public trust in the system, we would take the extraordinary step of allowing security and privacy researchers to inspect and verify the end-to-end security and privacy promises of PCC. In the weeks after we announced Apple Intelligence and PCC, we provided third-party auditors and select security researchers early access to the resources we created to enable this inspection, including the PCC Virtual Research Environment. So today we're making these resources publicly available to invite all security and privacy researchers, or anyone with interest in technology and technical curiosity, to learn more about PCC and perform their own independent verification of our claims. And we're excited to announce that we're expanding the Apple Security Bounty to include PCC, with significant rewards for reports of issues with our security or privacy claims. Now, there's the security guides. You're going to want to get that, the Private Cloud Compute Security Guide. You're going to need to download that and look through it. But when they say significant rewards.
Let me, let me just kind of scroll down here.
[00:56:07] Speaker A: These are no small potatoes.
[00:56:09] Speaker B: There are no small potatoes.
[00:56:10] Speaker A: Quite a few zeros here in this chart.
[00:56:13] Speaker B: Here's the category: remote attack on request data. Arbitrary code execution with arbitrary entitlements. Access to users' request data. Sensitive information about the user's requests outside the trust boundary. So yeah, it starts at a million bucks. Oh, and that's the maximum bounty you could get.
A million dollars. Then here's $250K, $150K, right? $100K, $50K. They ain't playing around. They're ready to spend some do-re-mi on anybody that's got the wherewithal to gain access, run remote code, get their grubby little digits all over your data. Yeah, they gonna pay the do-re-mi, if you know what I'm talking about.
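[Editor's note: the bounty tiers being read off the chart, collected in one place. The categories and maximum amounts below are as described in Apple's announcement at the time of recording; check Apple's Security Bounty page for the authoritative, current list.]

```python
# Apple's published maximum PCC bounty payouts, as read on the show.
PCC_MAX_BOUNTIES = {
    "Remote attack on request data: arbitrary code execution "
    "with arbitrary entitlements": 1_000_000,
    "Remote attack on request data: access to a user's request data or "
    "sensitive information about the user's requests outside the trust "
    "boundary": 250_000,
    "Privileged network position: access to a user's request data or "
    "other sensitive user information outside the trust boundary": 150_000,
    "Privileged network position: ability to execute unattested code": 100_000,
    "Accidental or unexpected data disclosure due to a deployment "
    "or configuration issue": 50_000,
}

for category, usd in PCC_MAX_BOUNTIES.items():
    print(f"${usd:>9,}  {category}")
```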
[00:56:51] Speaker A: Well, nothing motivates like money if you're trying to get people to help you find these bugs in your systems. Hey, a million dollars? Not a bad price tag. But what this reminded me of was something we talked about back in June, which feels like forever ago. You might remember Kaspersky. Kasper Sky. Pronounce that however you want, I'm not going to argue. Come fact-check me.
[00:57:09] Speaker B: I've never heard anybody call it Casper.
[00:57:10] Speaker A: Right? That sounds like a, like a cosplay name. Like, I just don't. That's.
[00:57:14] Speaker B: Are you sure they weren't just trolling you?
[00:57:16] Speaker A: It was more than one person. So, I mean, it would have had to be a group effort.
[00:57:19] Speaker B: Okay. Okay.
[00:57:20] Speaker A: I don't know, maybe. Maybe they're wrong. But anyway, I'm going to say Kaspersky because I think that's right. You might remember when we talked about this back in June that Kaspersky was able to find a bug that there was supposedly a million dollar bounty for, and Apple refused to pay Kaspersky for it.
[00:57:36] Speaker B: They did.
[00:57:36] Speaker A: So this was the article that we had referenced at that time and I think there was some talk about how like that was right around the time.
[00:57:41] Speaker B: That their name ended in Sky.
[00:57:45] Speaker A: Yeah, yeah, Casper sky to be exact.
[00:57:48] Speaker B: It's a ghost flying around in the.
[00:57:50] Speaker A: Ether, and not the friendly kind. No, but I think there was some talk about. Well, the reason was because there were going to be sanctions placed on Kaspersky, and so if Apple had paid them, there would have been legal issues. So they just didn't pay out. But Kaspersky did find that bug. So it's like, I mean, you had.
[00:58:07] Speaker B: Like, just like give it to a charity.
[00:58:10] Speaker A: Right. Kaspersky was even like, we don't even want the money, give it to anybody else. And Apple was like, no, we're not going to do that. So that's what it reminded me of. Obviously in this case, anybody could in theory go and find these. Anybody but Kaspersky, I guess. They're out of luck. They can't touch it.
[00:58:26] Speaker B: Yeah. If you've got a .ru. Yeah, no kidding. Apple don't want to hear about it. Keep it to yourself.
[00:58:32] Speaker A: Yeah, you're not, you're not welcome here.
[00:58:33] Speaker B: I guess if you want money for it, you have to sell it as a zero day.
[00:58:36] Speaker A: Yeah, but don't do that, don't do that, don't do that. That's bad, that's bad. Crime is bad. Again, these are against crime. Yeah.
[00:58:44] Speaker B: It's a fact that it's in the Bible.
[00:58:46] Speaker A: That is true. That is technically an accurate source. Yes, it is in the Bible. You.
[00:58:51] Speaker B: Are right. It says, don't do crime.
[00:58:53] Speaker A: And of course there are the other bug bounties listed there, if you're not really wanting to go for that big ticket one. It goes all the way down to this one, a $50,000 bounty.
[00:59:01] Speaker B: That's for an attack on request data from a privileged network position. So for 50 grand: accidental or unexpected data disclosure due to a deployment or configuration issue.
[00:59:12] Speaker A: Yeah.
[00:59:12] Speaker B: So yeah, just basically.
Hey, what's this? Yeah, it looks like user data, so.
[00:59:19] Speaker A: Oh, I think it's neat that companies like Apple will have programs like this. Hey, we'll give you some money, you help us out and keep our customers safe, hopefully.
[00:59:27] Speaker B: Yeah, you know, I've seen a little bit of a downward dip in popularity of. I say popularity. What I mean is, I don't see as much fervor for bug bounties as maybe five years ago. Five years ago it was like, everybody's got to get into bug bounty. Everybody. You should do bug bounty. And it was all about these millionaire bug bounty hunters. And now it's just like, yeah, there's bug bounty. It's out there.
[00:59:49] Speaker A: Please, please come help us find these bugs.
[00:59:52] Speaker B: Please.
[00:59:53] Speaker A: 20 undecillion dollars for the first one.
[00:59:55] Speaker B: I think what was happening was, because there were people making a very decent living doing bug bounty research, that's how it kind of goes. Everybody goes, well, I want to make a million dollars, I want to make 400K a year looking for bug bounties. That'd be awesome, I can hack on systems. And you can absolutely do that. But not everybody is equipped. Not everybody makes it that far.
[01:00:21] Speaker A: Yeah.
[01:00:21] Speaker B: And actually like the vast majority of people don't.
[01:00:23] Speaker A: Plus you have to, I'm assuming you have to be careful because yes, you're looking for these bugs, but you don't want to break any laws.
[01:00:28] Speaker B: I think a lot of the problem started. Like, people probably started becoming disillusioned with seeing that as a real avenue to employment, like being able to sustain themselves and make a living, because a lot of the programs will go, hey, you found that 100K bug? Buddy, no, you didn't. Yeah, right. Duplicate.
[01:00:47] Speaker A: We actually just found that. Right as you did.
[01:00:49] Speaker B: Yeah, it's already been disclosed. So this was a duplicate, and thanks for playing, but you don't get it. Or they don't value it as much as you think it should be valued, and so you don't get as much money. And there's no standard for everyone. Universally, there are standards out there in which bug bounty programs work, but you have to convince Apple, you have to convince Google, you have to convince whatever organization you're bug bounty hunting for that it is a P1 and not a P3. Right. And a lot of times you end up holding on to things you find and chaining them together so that you can get that P1.
And it can be really time consuming and difficult. And a lot of these, especially the web applications they create, are very complex. It takes forever to find the right chain of events. Oh, there's an IDOR here.
So it just wasn't as lucrative as everyone was making it out to be, as you would see in the headlines.
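[Editor's note: for listeners unfamiliar with the term Daniel just used, an IDOR, an Insecure Direct Object Reference, is when an API hands back a resource based only on its ID, without checking that the requester actually owns it. A minimal sketch of what probing for one looks like is below; the URL, token, and IDs are hypothetical, and this should only ever be pointed at systems you are authorized to test.]

```python
# Hypothetical IDOR check: authenticate as one user, then request a resource
# ID that belongs to someone else and see whether the API returns it anyway.
import requests

BASE = "https://api.example.test"      # hypothetical target
MY_TOKEN = "token-for-user-a"          # hypothetical session token
MY_ORDER, SOMEONE_ELSES_ORDER = 1001, 1002

def fetch(order_id: int) -> requests.Response:
    return requests.get(
        f"{BASE}/orders/{order_id}",
        headers={"Authorization": f"Bearer {MY_TOKEN}"},
        timeout=10,
    )

mine, theirs = fetch(MY_ORDER), fetch(SOMEONE_ELSES_ORDER)
print(mine.status_code, theirs.status_code)
# A 200 on a resource that isn't ours suggests missing object-level authorization.
if theirs.status_code == 200:
    print("Possible IDOR: server returned another user's order")
```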
[01:01:52] Speaker A: Yeah.
[01:01:53] Speaker B: For everyone. It seemed from the headlines like everyone was just throwing money in the air and having Scrooge McDuck fights with money. And that wasn't the case. Most people were making, like, a few thousand dollars a year. It's like a little side hustle.
[01:02:07] Speaker A: It's like supplemental. Yeah, yeah, yeah, absolutely. Almost. Almost. More like a hobby. But you get paid to do it.
[01:02:12] Speaker B: Yeah.
[01:02:12] Speaker A: Freelance bug bounty and you get to.
[01:02:14] Speaker B: Put CVEs on your resume. So it is.
[01:02:15] Speaker A: Yeah.
[01:02:15] Speaker B: I still really think that bug bounty has a lot of value to it.
So I would suggest you still look into it. Oh yeah, it just may.
[01:02:23] Speaker A: Not be a full time gig for.
[01:02:24] Speaker B: Just don't think that you're going to just run into it. Now, if you become really good at it, yeah, obviously you could start self-supporting.
[01:02:29] Speaker A: Anything's possible. There you go, Popsicle. You just never know. Well, that is not the. Thank you. That is not the only deja news piece that we have today. This next one, this is, it's just so interesting. The fitness app Strava, I think is how that's pronounced.
[01:02:43] Speaker B: Strava. Yeah.
[01:02:44] Speaker A: Gives away location of Biden, Trump and other leaders. French newspaper says. And apparently this is a fitness app.
[01:02:52] Speaker B: Yeah.
[01:02:53] Speaker A: So but it's not like Biden's going for a jog and he's got Strava turned on. It's the Secret Service agents.
[01:02:57] Speaker B: I'm sure that, you know, if you're looking at Biden's Strava app, it's just on a beach somewhere.
[01:03:03] Speaker A: That's true, that's true. But it updates, like. You can share your location, and these Secret Service agents, some of them use this app and did not turn that off, or had it enabled or whatever. And so there were trips. I think specifically there was a French leader that took a trip, and it was supposed to be a private trip to, like, a beachside resort or something. It was not on his agenda. It was meant to be very private, and people were able to figure out where they had been, their exact moves, where they traveled, because of this fitness app.
[01:03:31] Speaker B: I like this. You have all these. This does not bode well for the Secret Service, by the way. They've really been dropping the ball lately on their skills and capabilities.
Hopefully they do better very, very, very soon. Because, yeah, you've got an app, you're trying to be the epitome of health and fitness, and your job is to protect world leaders, you know, as personal protection. Now, it's not just Secret Service, obviously. It was, of course, President Macron and Vladimir Putin and all these other different people that are high value targets for whatever weirdo out there wants to take shots at them.
But you think I'm under protection from these people. But they're the ones that are actually giving away your position. That's the opposite of protecting your principal, by the way, FYI. Didn't know if you knew that. So, yeah, you're doing personal protection and you have Strava. Turn that shit off.
[01:04:28] Speaker A: Well, and you would think that there'd be some rule against this as a Secret Service agent. But I guess what it was is, you can't use personal electronic devices while on duty, but if you're off duty, you can use social media and whatever. So I guess if you're traveling with the President. And the First Lady's locations could be pinpointed as well, they said. So if you're traveling with them, and then, I don't know, you get like a lunch break as a Secret Service agent, and you're over there chomping on an In-N-Out burger, and are you.
[01:04:53] Speaker B: Then you're on the Facebook app. Right.
[01:04:56] Speaker A: Like what's the. What's.
[01:04:57] Speaker B: I'm going to do 100 burpees on my lunch break.
[01:05:00] Speaker A: Meanwhile, I'm just burping on mine.
So I wonder what the rules are surrounding that and how strict they are. I would think they'd be pretty strict.
[01:05:06] Speaker B: I would think. Is it just me, and you too? You good folks out there are smart. I'm going to outsource this to you. Tell me where I'm wrong. If I'm the head of whatever personal security entity that's guarding my principal,
I'm going to say: your phones are issued. They have only line-of-business apps on there, and Strava is not one of them. And here's why. We call this a double deja vu. I'm deja vuing my deja vu, because guess what? This has happened before.
If you would please direct your attention to Sophia's computer, where it says: heat map released by fitness tracker reveals location of secret military bases. That was in 2018.
As soon as I saw this yesterday, I was like, I remember this. Am I wrong?
[01:05:57] Speaker A: Like this name sounds familiar.
[01:05:58] Speaker B: And it revealed, like, secret CIA bases, because the agents were using Strava specifically as a fitness app and it was reporting. They were able to geolocate a secret base based off of the movements and geolocation of the application on their phones.
[01:06:16] Speaker A: Mm. And it wasn't just phones. It was like Fitbits and stuff like that.
[01:06:19] Speaker B: Right.
[01:06:19] Speaker A: So even if you left your phone behind, if you've got a Fitbit, it's tracking where you're going.
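[Editor's note: a toy illustration of why repeated fitness traces give a location away. The coordinates below are invented; the point is that even a naive average of where an agent's runs keep starting lands on top of the "secret" site. The 2018 Strava heatmap analyses worked on the same principle, just at global scale.]

```python
# Illustrative only: invented GPS start points for one agent's workouts.
from statistics import mean

run_starts = [  # (lat, lon) reported by the fitness app, per run
    (36.1023, -75.7135),
    (36.1019, -75.7141),
    (36.1027, -75.7130),
    (36.1021, -75.7138),
]

lat = mean(p[0] for p in run_starts)
lon = mean(p[1] for p in run_starts)
print(f"Inferred home base: ({lat:.4f}, {lon:.4f})")  # clusters on one spot
```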
[01:06:23] Speaker B: So since we've known that this has been a problem at least since 2018, if I'm the head of personal protection, I'm going to go, here's your issued phone.
You're not allowed to put any other apps on it than what's on it.
[01:06:35] Speaker A: Yeah.
[01:06:36] Speaker B: And Strava ain't one of them. Fitbit ain't one of them.
[01:06:38] Speaker A: If you want to be part of the Secret Service, I feel like that's not a terrible standard to have in place. That's not.
[01:06:42] Speaker B: You cannot bring your personal phone.
[01:06:45] Speaker A: Yeah.
[01:06:45] Speaker B: To. It goes in your locker and you don't get to use it. You're on duty. This is part of the job protection of your principal. Trumps any kind of personal. I want to do my, you know, I want to do Facebook posts, I want to make an Instagram post. No, you don't get to do that. You're on duty. Sorry.
[01:07:02] Speaker A: I would think to be any kind of a Secret Service agent, whether you're on detail with like a lower level politician or whatever.
[01:07:07] Speaker B: Yeah.
[01:07:07] Speaker A: I would think that standard would be in place and it would be a tough job to get to qualify for.
[01:07:11] Speaker B: Right.
[01:07:11] Speaker A: To be on duty, like, protecting the President. And this is. Okay, I mean, obviously they just didn't know, and so now they're taking precautions. But like you said, this came up back in 2018. So it's like, well, how do you not know? How are you not.
[01:07:23] Speaker B: How is this not a part of your policy now?
[01:07:25] Speaker A: Yeah.
[01:07:26] Speaker B: Right. Because of the tracking capabilities of phones and the apps that are on the phones. Like, honestly, a lot of things are tracked. The organizations themselves, like Samsung and Apple, they're tracking where their phones are going. And Google, obviously. So if you've got any phone with Google stuff on it, it's tracking you.
It's starting to deja vu me yet again. The triple deja vu: the dude that put a bunch of phones in a wagon and walked up and down his street.
[01:07:53] Speaker A: Oh, yeah.
[01:07:53] Speaker B: So that was reporting to the traffic apps that there was a slowdown in that area.
[01:07:59] Speaker A: Yeah.
[01:07:59] Speaker B: So people were avoiding his street, and he didn't get the traffic.
It doesn't even matter that it was Strava. Like, these phones are reporting all the time to too many different organizations. There should be a rule.
If you are in personal protection of high value principals? Nope. Sorry.
[01:08:18] Speaker A: Yeah. Why is your phone not locked down? More than that.
[01:08:20] Speaker B: Here's your brick phone.
[01:08:21] Speaker A: Yeah.
[01:08:21] Speaker B: You know, here's your dumb phone that doesn't have any of that.
[01:08:24] Speaker A: Here's your Firefly or whatever, your Jitterbug. This is what you get. You can call 911, you can call the people you need to call. That's it.
[01:08:31] Speaker B: It's on a secured network. Yeah, that's it.
[01:08:34] Speaker A: Yeah. One of the points that got brought up, I didn't even think about this: sometimes those agents will travel ahead of time, before, right, you know, whoever they're protecting.
[01:08:41] Speaker B: Where they're going to be.
[01:08:42] Speaker A: So it's not even like, oh, you just know where the President is. If you were a bad actor, if you were an attacker, you could in theory figure out where he's going to be, if you did the work. So definitely not a great look for some of the security. And like you said, it wasn't just Secret Service. They list it out.
Twelve members of the French GSPR, the security group of the presidency of the Republic, and then six members of the Russian FSO, in addition to 26 US agents. So this is an international problem.
[01:09:06] Speaker B: Yeah. They say, we do not assess that there were any impacts to protective operations or threats to any protectees. It added, locations are regularly disclosed as part of public schedule releases. But not timelines. And not always. Just because locations are sometimes disclosed, they're not always disclosed, for security purposes, especially timelines. Why do you think that, like, Marine One or whatever it is, they have seven of them or whatever, and you don't know which one the President gets on, and they fly off in separate directions?
[01:09:37] Speaker A: Yeah.
[01:09:38] Speaker B: If your fitness app is telling me which one he's on. Yeah, that could be an issue.
[01:09:43] Speaker A: It reminds me of that game where you put a ball under a cup and there's, like, seven cups. But if the cup's transparent, it really doesn't matter. It's not really helping you, so.
[01:09:51] Speaker B: And maybe we're overblowing this, but it just seems like I think when it.
[01:09:55] Speaker A: Comes to national security, you can never be too careful.
[01:09:57] Speaker B: Never be too careful. Right. Err on the side of caution.
[01:10:00] Speaker A: What do I know? I'm not a security guard or agent or whatever, but it just seems like something you would want to be cautious about. We've got a couple more articles here. I purposely put these kinds of articles at the end, because I know it'll be like, we're running out of time, and it encourages me not to dwell and get too angry. So anyway, this first one. I'm not sure how I feel about this yet, but SAG-AFTRA has struck a deal with a company called Ethovox on AI voice protections. And this is also kind of a deja vu, because we've talked about this before, how SAG-AFTRA was on strike. And so they strike a deal. That's funny, since they were on strike for a while. Their members were on strike because there was a lack of sufficient protections, in their opinion, in place to protect voice actors, or whatever the case may be, from having their voices used to train AI or duplicated with AI. So in this case, Ethovox is a voice AI company, and SAG-AFTRA has struck a deal with them for SAG-AFTRA members who do want to make digital replicas of their voice. So basically, the goal was never to just completely shut this down and make sure it never happens.
[01:10:58] Speaker B: Right.
[01:10:59] Speaker A: The goal was to provide a safe path for actors that do want to use their voices to train AI or make digital replicas, and to protect the actors that don't from being taken advantage of or having their voice cloned without their permission. So I don't really know. I mean, I guess if people want to do it, go ahead. You have the freedom to do so. I personally don't agree with it, and I think that by doing that, you're kind of opening the door to some stuff. You just never know where that's going to lead. But I guess you can make that argument about anything. So I don't know how I feel about this, as an actor, as somebody that's a little skeptical about some of this AI stuff. I just don't know where I stand. Do you have any thoughts or opinions?
[01:11:37] Speaker B: My thoughts on this are it feels like we are really in a time where ownership of our data and likeness is important and we need to start creating some sort of safeguards around that so that a person gets to own their person. I get to own my voice. I get to Own my likeness, and that is mine. Right. And I can.
I can sell it to whoever I want, or more like lease it for contractual purposes, but another entity could never own it. And at any given time, I would have the rights to recall that license for even arbitrary reasons. It's mine.
[01:12:22] Speaker A: Absolutely right.
[01:12:23] Speaker B: If I've licensed it out to you, a part of the license agreement is that at any given time, I own this, and I can remove your right to use it at any given time. And you might be like, well, that's not fair, because I'm creating marketing based off of the fact that we're going to use this. Well, yeah, it's not fair to you. What's fair to me? Since it's me and it's mine, that's all that matters. I don't care about you making a product and how you're using that product. As soon as I feel like I need to revoke that privilege, that's my right. Because it's me. Those are my thoughts on this. So we probably need to enact some legislation that protects people, to say that I own my data, it's mine, you can't have it. I can license it to you so that you can make money off of it, but if I want to recall that, I can. That's how I feel about it.
[01:13:15] Speaker A: I would agree. And it's scary that at this point, like you said, protect your data. It's not even just limited to what you do online or the emails you send or whatever the case may be. It's now, like, your face print, your voice, your iris, your fingerprint. I mean, fingerprints, I guess, have always been kind of an identifier, but we're seeing more and more stuff that is like, okay, yes, it's mine and there's nothing else like it in the world, because it's unique to everybody. But it's not like a password. If somebody gets my password and is using it, I can change it. I can't change my face print. I mean, I guess I could get surgery on my vocal cords, but I can't change my voice print at the click of a button.
[01:13:47] Speaker B: Not very easily.
[01:13:48] Speaker A: Right. And I shouldn't have to. I shouldn't have to.
[01:13:50] Speaker B: And that's the other part of it. You shouldn't have to.
[01:13:50] Speaker A: Yeah. I should be able to keep my voice and not have to worry about somebody stealing it. And in this case, SAG-AFTRA is arguing, hey, the big things we want to make sure of are informed consent and fair compensation when it comes to people that do want to work with companies and don't mind the use of AI. Okay, that's fine. You can use my voice and duplicate it and use it for a character or whatever. But for every line recorded with my AI voice, you're going to pay me this amount. It's, you know, a dollar per line or whatever. I mean, that's probably a high rate, but a dollar per line, whatever the case may be. Right. And consent. If you're going to do this, you are going to tell me exactly everything you're going to use my voice for, so that my voice isn't appearing in ads for, like, hemorrhoid cream. Like, I don't need.
[01:14:31] Speaker B: If you don't want it to, maybe you're cool with that today, but tomorrow you wake up and go, I don't like being part of hemorrhoid cream.
[01:14:36] Speaker A: Yeah, I don't want to advertise that.
[01:14:37] Speaker B: By the way, you should totally go for percentage and not like a set price.
[01:14:42] Speaker A: And never sign anything that says in perpetuity. Yeah, always, always, always double check with your, I don't know, legal counsel, agent, whatever. That was something that was drilled into me when I first started doing that.
[01:14:51] Speaker B: No perpetuity.
[01:14:52] Speaker A: No "in perpetuity." That's bad. That means forever and ever and ever, so don't do it. And there are still some SAG-AFTRA video game workers on strike having to do with AI protections. They're in negotiations still. They came back to the table recently, but the strike's ongoing for now. So anyway, if there's anybody else out there that's kind of into that space of voice acting and AI voice cloning and whatever, there you go. That's the update on that. This last one. Oh, this is fun. Ubisoft, or however you want to pronounce that, just quietly launched a full-blown NFT game. Or it sure looks like one, says IGN. And oh boy, does it. So this is billed as a PvP tactical RPG game on PC, which, you know, that's not totally foreign. We see other games like that. RPGs have been a thing for a long time. But in this case, it's a Web3 game, and if we scroll down here, there's lots of ads. If we scroll down here. This is why I like reader mode. The Web3 comes into play as a method of collecting figurines to battle with, which are, you guessed it, NFTs. And these things range from $7 to $63,000.
And that, to me, is just a little absurd.
That's a little much for an in-game figurine. I mean, come on. I know that's the whole point. I mean, with NFTs you're paying thousands of dollars or whatever for a picture of a monkey. But come on, come on. Who's going to play this? Who's going to really do this?
[01:16:15] Speaker B: I just don't. I've never gotten the appeal.
[01:16:18] Speaker A: Yeah.
[01:16:18] Speaker B: I don't understand the appeal of wanting to purchase these in-game items, especially when it comes to NFTs. That junk has never made sense to me whatsoever. I'm surprised this is still a thing.
[01:16:36] Speaker A: Yeah.
[01:16:36] Speaker B: To be honest with you, I really thought it.
[01:16:38] Speaker A: Was going to be a fad.
[01:16:39] Speaker B: Yes.
[01:16:39] Speaker A: It's just going to fade off.
[01:16:41] Speaker B: Right.
[01:16:42] Speaker A: That's basically what it is. Except at least you can hold the rock.
[01:16:45] Speaker B: Yeah. There's a rock NFT.
[01:16:48] Speaker A: This isn't even yours.
[01:16:49] Speaker B: Correct me if I'm wrong: with NFTs, basically you pay for the right to say it's yours.
[01:16:55] Speaker A: Yeah. You have a license to it or something. So it's not even. At least with a pet rock, there's googly eyes on it, and I can hold it in my hand and say it is mine. And maybe I look like a crazy person, but I can hold that thing in my hand, and that is mine.
[01:17:06] Speaker B: Sweet.
[01:17:07] Speaker A: And potentially it's free, because I just get a rock out of my garden and glue some eyes on, you know. Even to pay $7 for an in-game figurine, personally, I wouldn't do it. But people pay for stuff in games all the time. That's not new.
[01:17:21] Speaker B: But to me, it is an outrage. I pay for the game. I don't pay for access to the game.
[01:17:30] Speaker A: So that's the thing.
[01:17:31] Speaker B: Right.
[01:17:32] Speaker A: It's free to download, but you have.
[01:17:34] Speaker B: To have access how they make their money.
[01:17:36] Speaker A: Exactly. And you have to have a blockchain wallet to actually even play it.
[01:17:39] Speaker B: Stop doing this.
[01:17:41] Speaker A: And you get like a few figurines that are temporary to play with. But eventually if you want to progress in the game, it's like a pay to win kind of thing. You have to eventually start buying these figurines so you can't really get enjoyment out of it and pay and play the game to its fullest extent without.
[01:17:55] Speaker B: Without paying a bunch of money.
[01:17:56] Speaker A: There are plenty of other games that, yes, you can pay into it, but you don't have to.
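[Editor's note: a minimal sketch of what "the figurine is an NFT" means mechanically: ownership is just a record on a smart contract that anyone can query. This assumes a standard ERC-721 contract and the web3.py library; the RPC URL, contract address, and token ID below are placeholders, not the actual game's contract.]

```python
# Hypothetical ERC-721 ownership lookup with web3.py.
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://rpc.example.test"))  # placeholder RPC node

# Minimal ABI: just the standard ERC-721 ownerOf(tokenId) view function.
ERC721_ABI = [{
    "name": "ownerOf", "type": "function", "stateMutability": "view",
    "inputs": [{"name": "tokenId", "type": "uint256"}],
    "outputs": [{"name": "", "type": "address"}],
}]

figurines = w3.eth.contract(
    address="0x0000000000000000000000000000000000000000",  # placeholder contract
    abi=ERC721_ABI,
)

# The game's "do you own this figurine?" check boils down to one call:
owner = figurines.functions.ownerOf(1234).call()
print(f"Token 1234 is owned by {owner}")
```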
[01:18:00] Speaker B: And then, of course, there's the ethical idea behind children that play games. Right. Well, some adults don't have the maturity to know, oh, I shouldn't go rob my grandma's credit card to pay for in-game purchases. And children a lot of times do not have that kind of impulse control. Right.
[01:18:21] Speaker A: Or they don't understand.
[01:18:22] Speaker B: Right. They don't understand the difference about economy and interest and all the things that go into people's personal accounts. Like, they don't get that. So making games that could potentially take advantage of that, that should be illegal.
[01:18:37] Speaker A: And in this case, it should be, if it's not. I'm not sure if that's illegal.
[01:18:42] Speaker B: Is there some sort of law around this? Like, I just want to play a game that's fun.
[01:18:45] Speaker A: I'm curious.
[01:18:46] Speaker B: I give you. I give you dollars, you give me game. I play game, I go game. Fun.
[01:18:51] Speaker A: Except now you don't even own the game, right? If it's digital, you don't even own it. You have the right to play it, and they can revoke it anytime they want. We talked about that. But in this case, this game does have an adults-only age rating. It's listed as 18 or older, and so if you haven't confirmed you are 18 or older, then you can't play. But that has never stopped kids before. Games have been rated M for years, and it never stopped my brother from playing Call of Duty when he was in middle school.
[01:19:15] Speaker B: There's an update to that. It says that Ubisoft did not submit Champions Tactics for a rating, and that the Adults Only 18+ rating icon in the original launch trailer was included in error.
[01:19:31] Speaker A: Wait, where? I'm totally missing that.
[01:19:34] Speaker B: Yeah.
[01:19:35] Speaker A: Update. Let me see if I can find that.
[01:19:37] Speaker B: Yeah, update, 10:30, as of today.
[01:19:39] Speaker A: Oh, I had to refresh the page because that literally happened while we were recording.
[01:19:43] Speaker B: Yeah, that literally happened.
[01:19:44] Speaker A: That's crazy. Wow.
[01:19:45] Speaker B: In real time.
[01:19:46] Speaker A: This just in.
[01:19:47] Speaker B: That's right.
[01:19:47] Speaker A: Ubisoft did not. So the adults-only rating. Oh, it was included in error. So there's not even an adults-only requirement. The one thing they could have said is, well, technically we're not exposing kids to it, because this has an adults-only rating, so your kids shouldn't be playing it. If that's not there, it looks like anybody can play it. So you're right, your argument about kids getting in here and not fully understanding stands.
[01:20:11] Speaker B: Right.
[01:20:11] Speaker A: They've got nothing to protect against that. So I'm just. Not so much surprised, but just.
[01:20:16] Speaker B: Just disappointed.
[01:20:17] Speaker A: Just disappointed.
[01:20:18] Speaker B: You know, it's not that we're angry. It's that we're disappointed. We're just.
[01:20:21] Speaker A: We're just really disappointed in you. Do better. NFT games. Come on. Come on. Play Black Ops 6 like everybody else. Be normal.
[01:20:27] Speaker B: Do you folks out there really want to play these kinds of games? I just don't get it, personally. It's a subjective thing, I get it. I just don't get the appeal.
[01:20:40] Speaker A: And this is, to me, more extreme. Like, we talked about gacha games last week, right?
[01:20:44] Speaker B: Yeah.
[01:20:44] Speaker A: And I can see the appeal of those. Those types of games have been around for a little bit, and they're super popular. Like, Genshin Impact is a hugely popular game. But in this case, I'm like, okay, it's easy for those transactions to add up. It's maybe just a little here, a little there. But to drop, like, a hundred bucks at once on a figurine in a game? I mean, I hate to insult anybody, but you'd have to be a little crazy.
[01:21:04] Speaker B: You know what it is? I'm a big picture kind of person. I see: oh, you've basically got me hooked into a subscription model. That's weird. I'm not subscribed, but if I want to keep playing, I have to pay you more money. Yeah, right. That's basically a subscription.
[01:21:23] Speaker A: And I think with those gacha games, a lot of times it's easier to progress if you spend the money, but you don't necessarily have to. But in this case, it looks like if you want to get past that training phase, you basically have to buy at least one figurine.
[01:21:33] Speaker B: Right. And I'm not the kind of person that likes hemorrhaging money out.
[01:21:35] Speaker A: Yeah.
[01:21:35] Speaker B: Monthly. On monthly things. It's just not my gig.
[01:21:38] Speaker A: I'm not a big fan of monthly money hemorrhaging.
[01:21:40] Speaker B: Somebody might be like, well, Daniel, do you have streaming services? I have, like, two streaming services, and I only ever have two. So if I'm like, you know what, I've been watching Hulu for a while, let's go see what's on Netflix.
[01:21:51] Speaker A: You cancel your.
[01:21:51] Speaker B: I shut off Hulu and I turn on Netflix.
[01:21:54] Speaker A: Yeah.
[01:21:55] Speaker B: Right. Because I'm not against the idea and per se of streaming or subscription model.
[01:22:02] Speaker A: But I'm not going to have 10.
[01:22:03] Speaker B: I'm not. See, this is how it ends up working. This is why. Remember, Netflix used to be like, that's where you went. That's what everybody had: Netflix. Why? Because it had all the movies and all the shows. Everything streamed through Netflix, because nobody else had streaming services.
And then everybody said, you've got our content. We're licensing our content to YouTube. Not YouTube. I'm sorry, Netflix.
And we're missing out on streaming revenue. So everybody now has a streaming service. Multiple streaming services. And if you want to watch this show, well, you can't just have Paramount Plus, you have to have Paramount Network. You can't just go with Discovery Plus, you have to have Discovery IG or whatever it is.
[01:22:46] Speaker A: Yeah, right.
[01:22:47] Speaker B: You have to have the three different ones so you get all the things. And again, now, oh, well, it's only $4 here and $6 there, but nickels and dimes add up real quick. Yeah, right. And they got you. That's how they get you, right? They tell you it's just a little bit of money. What's four bucks? That's a latte. That's one drink.
[01:23:07] Speaker A: Yeah. Skip your coffee for the day, you'd be fine.
[01:23:09] Speaker B: And then you got it.
[01:23:10] Speaker A: Yeah.
[01:23:10] Speaker B: And then we wonder why people are, like. Who's the guy? He's got the channel where he goes over people's finances.
[01:23:16] Speaker A: Dave Ramsey.
[01:23:17] Speaker B: No, not Dave Ramsey. Even though he does it as well. This is more like. He's a younger dude. Glasses.
[01:23:21] Speaker A: Okay.
[01:23:22] Speaker B: I don't think you'd know him. Curly black hair. And he looks over people's finances and tells them, this is why you're in trouble. You have a brand new Toyota Tundra.
[01:23:29] Speaker A: Yeah.
[01:23:29] Speaker B: With all the bells and whistles. How much did that cost? Oh, $60,000.
[01:23:33] Speaker A: Yeah.
[01:23:33] Speaker B: Go sell it today. And this is what happens. We end up.
And if you don't have that, you've got FOMO, right? Oh, everybody's talking about the show. I gotta watch this. I gotta play that game. I gotta, blah, blah, blah.
We do not give out financial advice. We are not financial advisors.
[01:23:51] Speaker A: The scrolling. The scrolling disclaimer text goes over the screen.
[01:23:54] Speaker B: Talk to a financial advisor, a professional, to help benefit you financially. We are not them.
[01:24:00] Speaker A: We never claim to be.
[01:24:01] Speaker B: This is my opinion and my opinion only. Do not listen to me or take advice.
[01:24:05] Speaker A: And they do not reflect the opinions of our employer.
[01:24:06] Speaker B: That's right.
[01:24:07] Speaker A: Add that little caveat, a disclaimer. All right, and don't kill your neighbor. There we go. We just gotta throw that out. Don't run with scissors. Don't shin-kick a trailer hitch.
[01:24:17] Speaker B: Oh, man. It will hurt.
[01:24:19] Speaker A: It'll hurt. And don't punch a cabinet. For sure.
[01:24:22] Speaker B: No, no, no.
[01:24:22] Speaker A: Well, that's all we had for our articles today. A little bit of a longer episode. But, hey, it's Halloween. Let's celebrate. We wanted to give you a little. Little extra this week. Plus, I missed work a few weeks ago, and I'm trying to make up for that still, so. Thanks so much for joining us for this episode, Daniel. I hope you had a good time listening to me get angry about some of this stuff. And, of course, thank you for joining us. Let us know what you enjoyed and what you want to see in the future. And until next week, we'll see you around.
Thanks for watching. If you enjoyed today's show, consider subscribing so you'll never miss a new episode.