Episode Transcript
[00:00:04] Speaker A: You're listening to Technado. Welcome and thanks for joining us for this episode of Technado. Quick reminder, Technado is sponsored by ACI Learning, the folks behind ITPro. And you can use that code, TECHNADO30, for a discount on your ITPro membership. Stop the timer.
[00:00:18] Speaker B: New record. Yeah. There it was. 20 seconds.
[00:00:20] Speaker A: 20 seconds.
[00:00:21] Speaker B: Not bad.
[00:00:21] Speaker A: I gotta beat my time every time. Thank you so much for joining us. We've got some interesting stuff we're gonna get into this week. Pull out some deja news pieces. Cause we do have some updates on some big stories. But of course, we've got, you know, breaking stuff and everything. And so I'm looking forward to it. How about you?
[00:00:35] Speaker B: Always a good time. Always a good time. I did have a question for you, though.
[00:00:38] Speaker A: Okay.
[00:00:39] Speaker B: Just because it's on my mind.
[00:00:41] Speaker A: Okay. Get it out of the way.
[00:00:42] Speaker B: I randomly got down this rabbit hole, but I thought it'd be fun to get your opinion and yours as well, because maybe you agree or disagree or whatever.
Michael Myers versus Jason Voorhees, who wins in this death battle.
Yes. See, I knew you would be like, hey, that's not an easy question to answer now that you say that.
[00:01:03] Speaker A: So don't be mad at me.
[00:01:04] Speaker B: I won't be mad.
[00:01:06] Speaker A: I don't believe I've seen either of them in their entirety.
[00:01:12] Speaker B: But you know who the characters are.
[00:01:13] Speaker A: So that's all that matters. So Jason Voorhees is Friday the 13th, right?
[00:01:17] Speaker B: Yes.
[00:01:17] Speaker A: And he's the guy that. He was at the bottom of the lake or something?
[00:01:19] Speaker B: Yes. Crystal Lake camp. Crystal Lake.
[00:01:21] Speaker A: Yeah. And he. But he's like, he was drowned. He can't be killed or something.
[00:01:24] Speaker B: Correct. Right.
[00:01:25] Speaker A: I. Michael Myers, I thought, attacked Jamie Lee Curtis, the babysitter.
[00:01:28] Speaker B: It was his sister.
[00:01:29] Speaker A: Yes, the sister. Okay. But he was in prison.
[00:01:31] Speaker B: Correct. Escapes also can't be killed.
[00:01:34] Speaker A: Okay.
[00:01:35] Speaker B: Unstoppable.
[00:01:36] Speaker A: Do we know the lore behind why either of them can't be killed?
[00:01:39] Speaker B: Behind both of them, and what they can and cannot do. The reason I ask, if you haven't seen, there's this.
[00:01:44] Speaker A: There's like a Jason versus Freddy or something, right?
[00:01:47] Speaker B: There's a movie, Freddy vs. Jason. Yes. I believe that Jason came out on.
[00:01:50] Speaker A: Top, but this is Jason versus Michael.
[00:01:53] Speaker B: This is Jason versus Michael, which has never occurred, to my knowledge. I just thought they're such similar characters, and I happened to see the Death Battle on it. That is a thing. Boomstick. And so. I forget the names of the characters that do that show. Very entertaining. And they do, like, the science behind who would win.
[00:02:10] Speaker A: Yeah.
[00:02:10] Speaker B: And then they make an animation of them fighting. Very entertaining. I won't spoil it for you. So if you want to go watch that, that's my.
I want to hear Sophia's take on this.
[00:02:21] Speaker A: My inclination is to say Jason would win because I feel like he's got the advantage of terrain. So, like, okay, we know both of them can operate on land.
[00:02:29] Speaker B: Yes.
[00:02:30] Speaker A: I don't know that I've ever seen Michael Myers operate in water, but we know Jason has. Has a hold on that.
[00:02:35] Speaker B: He's got a. Jason pulls him into the aqua.
[00:02:37] Speaker A: He can get the. He can get the advantage.
[00:02:38] Speaker B: He might pull the win.
[00:02:39] Speaker A: Yeah.
[00:02:40] Speaker B: Due to.
[00:02:40] Speaker A: I don't know if Michael Myers is.
[00:02:41] Speaker B: Capable of that. Like, home-court advantage.
[00:02:43] Speaker A: I guess if he can't die, he can't be drowned. But still, like, we don't know how well he moves in water, and he's got that big jumpsuit on and the big mask. I would imagine he fills up with water.
[00:02:49] Speaker B: I would recommend. You should watch that episode. Very interesting and entertaining.
[00:02:54] Speaker A: I'll watch the movies first, especially because we're. I mean, I know it's only July, but fall's gonna be here before we know it. Actually, it's August as of the day this episode is airing. So. Happy August.
[00:03:01] Speaker B: Happy August.
[00:03:02] Speaker A: Almost autumn, I guess. But, yeah, it's coming up quick enough that I could watch those movies in preparation for.
[00:03:07] Speaker B: Yeah, that's right. We're almost there.
[00:03:08] Speaker A: We're almost there. It'll be there before we know it. But interesting question.
[00:03:10] Speaker B: I just thought it'd be fun to. That is random for everyone.
[00:03:13] Speaker A: When you said, I want to get your take on something, I really thought it was gonna be like something controversial. And of course that could be. I don't know.
[00:03:19] Speaker B: You never know.
[00:03:20] Speaker A: What do you think Jason or Michael or Freddie throw him in there? Maybe when it's a two on one, maybe that changes the outcome.
[00:03:27] Speaker B: In the dreamscape.
[00:03:28] Speaker A: In the dreamscape all day.
[00:03:29] Speaker B: Right?
[00:03:29] Speaker A: Yeah, that's true. That's true. He has the advantage of a different plane.
[00:03:32] Speaker B: That's his home field.
[00:03:33] Speaker A: Right. It's not terrain, it's plane. So that's interesting. You're breaking my brain, Daniel. Too late in the day for this. Well, we have got some good stuff for you today. Hope you enjoy. We'll go ahead and get into it. It's our first and favorite segment. Breaking news.
Breaking news.
Wow.
[00:03:51] Speaker B: Even though I only one gunned it, I had to.
[00:03:54] Speaker A: You treat him equally.
[00:03:54] Speaker B: Smoke off of both.
[00:03:56] Speaker A: You gotta treat them equally.
[00:03:57] Speaker B: Yeah, I accidentally discharged underneath.
Got Sophia right in the foot.
[00:04:02] Speaker A: It's fine. It's like Goodfellas. I'm living my dream.
[00:04:04] Speaker B: Let's try it again after that.
[00:04:06] Speaker A: Living my dream.
[00:04:07] Speaker B: He's fighter.
[00:04:08] Speaker A: He told me to dance. That's what it was. Well, this first one we've got here, we've got a couple of bugs in our breaking news today. "Patch now: ServiceNow critical RCE bugs under active exploit," which is, uh. Oh, always so fun when we have a bug that is currently being exploited. Is there a fix for this? Do we know?
[00:04:25] Speaker B: Oh, I believe, I guess if this.
[00:04:26] Speaker A: Is patch now, it does say patch now.
[00:04:28] Speaker B: So that means there is a patch.
[00:04:30] Speaker A: That would make sense if you are.
[00:04:31] Speaker B: Running ServiceNow. I think it's ServiceNow. Right?
Software, yeah. You should probably go and make that patch happen, because as Miss Sophia and the article have said, this is actively being exploited in the wild. We've got a couple of CVSS scores going on here. We've got this one, CVE-2024-4879, with a CVSS score of 9.3.
And then CVE-2024-5217 with a 9.2. If I'm remembering correctly, there are some issues with input validation that are allowing for, you know, the bad stuff. We'll call it the bad stuff today because that's the most technical term I.
[00:05:13] Speaker A: Can come up with a.
[00:05:14] Speaker B: To be allowed to happen. If I'm not mistaken, there's also some proof of concept code that is out there in the wild. Yep. A public proof of concept exploit was quickly published, paving the way for widespread attacks in the wild. So if you are running ServiceNow, I highly recommend you hit the stop button and go make that patch happen.
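[Editor's note: the CVEs discussed above were reported as input validation gaps that let user-supplied text reach a server-side template engine. Here is a generic toy sketch in Python, not ServiceNow code, of why that class of bug matters; the `Config` class, its secret, and both render functions are made up for illustration.]

```python
from string import Template


class Config:
    # Hypothetical server-side object holding a secret.
    SECRET = "s3cr3t-api-key"


def render_unsafe(template: str, config: Config) -> str:
    # Naive rendering: attacker-controlled text goes straight to .format(),
    # which allows attribute traversal on the objects it is given.
    return template.format(config=config)


def render_safe(template: str, values: dict) -> str:
    # Constrained rendering: string.Template only substitutes whole
    # placeholders and cannot walk object attributes.
    return Template(template).safe_substitute(values)


# A benign template behaves as expected:
assert render_unsafe("Hello!", Config()) == "Hello!"

# But an attacker-supplied template can walk into the config object:
leaked = render_unsafe("{config.SECRET}", Config())
assert leaked == "s3cr3t-api-key"  # secret exfiltrated via the template

# The constrained renderer leaves unknown placeholders untouched:
assert render_safe("$config", {}) == "$config"
```

The point of the sketch is the difference in input handling, not the specific ServiceNow flaw: validate (or strictly constrain) anything user-supplied before it touches a template engine.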
[00:05:34] Speaker A: Well, thank you for that, Daniel. Thank you for that breaking news, because, yeah, 9.3, 9.2, that's nothing to sneeze at as far as that CVSS goes. That's not the only game in town, though. We've also got EchoSpoofing, which is a massive phishing campaign exploiting Proofpoint's email protection to dispatch millions of perfectly spoofed emails. Perfectly spoofed. Well, at least they meet the standard. That's good. So this has been going on, to my understanding, since like January of this year, I think, at least.
[00:06:01] Speaker B: Yeah, sure.
[00:06:02] Speaker A: Going on a while. Um, and just a few years ago, this was maybe not as big of a concern. And now they have been launching phishing email campaigns, taking hold of identities like Disney, IBM, and Coca-Cola.
[00:06:14] Speaker B: Yeah, so the real issue here is with Proofpoint, right? Is it Proofpoint? Is that the. Yeah, Proofpoint, right. They are very popular for email validation and routing and things of that nature. So it's highly likely that you are hitting a Proofpoint server from time to time when it comes to your email, especially with a lot of these larger organizations. This is a long article. It's a full-on masterclass on email spoofing, so enjoy, if you're really interested in what's going on here. I do believe that Proofpoint has taken steps to, you know, fix this, triage the problem, and now it should not be as much of an issue. But what it does bring up is the fact that we tend to rely on certain protections to keep us safe, and just because they are there does not necessarily mean that we are safe from those old tried-and-true attacks like spoofing. Back in the day, if you wanted to spoof something, you logged into an SMTP server, you threw a couple of commands on the command line at it, and voila.
A perfectly crafted spoofed email. And it took someone with a very discerning eye to be able to determine what that was. But now we've come up with things like DKIM and SPF to help keep those things a little more secure. SPF: hey, are you on the authorized senders list? If not, see ya, you're out of here. That's what SPF does. DKIM is like: has this been digitally signed, and is it from who it's supposed to be from?
Cool. Guess what? Through this little snafu and the way that the Proofpoint servers could be stood up, those checks could actually pass on a spoofed email. So if you're coming in from something like Gmail or Outlook, or, I'm sorry, Microsoft 365, there were some pretty easy set-it-and-forget-it settings in Proofpoint that would allow: hey, if I'm getting this from Gmail, just send it on through, and the same kind of thing. So I could create a spoofed email in one of those services and send it through Proofpoint. Proofpoint would then basically think this is, you know, they use Disney and things of that nature, Coca-Cola. Yeah, it's Disney. DKIM it and SPF that sucker all day long. Send it out. Now it's coming through and it looks like it is straight from there. It has all the right credentials, allowing for, like I said, perfectly crafted spoofed emails. So those were the highlights of what I got in the five minutes I tried to read. It is a long tome of an article.
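[Editor's note: to make the SPF check Daniel describes concrete, here is a deliberately simplified Python sketch. It handles only `ip4` mechanisms and the trailing `all` qualifier; real SPF evaluation (RFC 7208) also covers `include`, `a`, `mx`, macros, and redirects, so treat this as an illustration of the "are you on the authorized senders list" idea, not a usable validator.]

```python
import ipaddress


def spf_allows(spf_record: str, sender_ip: str) -> bool:
    """Toy SPF evaluation: pass if the sender IP matches any ip4
    mechanism; otherwise fall back to the record's 'all' qualifier."""
    ip = ipaddress.ip_address(sender_ip)
    default = False
    for term in spf_record.split():
        if term.startswith("ip4:"):
            # Authorized sender range: match means the mail is allowed.
            if ip in ipaddress.ip_network(term[4:], strict=False):
                return True
        elif term in ("-all", "~all"):
            default = False  # everyone else fails (hard or soft)
        elif term == "+all":
            default = True   # record explicitly allows anyone
    return default


# Hypothetical record for a domain that only sends from 198.51.100.0/24:
record = "v=spf1 ip4:198.51.100.0/24 -all"
assert spf_allows(record, "198.51.100.7") is True   # on the senders list
assert spf_allows(record, "203.0.113.9") is False   # spoofer rejected
```

The EchoSpoofing trick didn't break this math; it abused relay configurations so that the mail left a server the checks already trusted, which is why the spoofed messages passed.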
[00:09:00] Speaker A: It's very in depth, which is great.
[00:09:02] Speaker B: It is good. So you have all the details within this article if you need to take a look at what's going on there.
[00:09:06] Speaker A: Yeah, we'll link it if you do want to get into the details of how this works. But yeah. It started around January 2024, which is when the activity started to pick up. A daily average of 3 million perfectly spoofed emails, but the highest peak was a daily number of 14 million, which is crazy to me, I guess, in the grand scheme of a world population of about 8 billion.
[00:09:23] Speaker B: Yeah, well, I mean, we do a lot of email out there. There's a lot of. It is spam.
[00:09:28] Speaker A: Yeah.
[00:09:28] Speaker B: So, you know, you just gotta.
We. You could be doing your due diligence there. That's a tough sentence to say.
And it still wouldn't have mattered because of this problem. Just goes to show you, you don't just click your way through an installation. A configuration, to be very exact. You need to be very, like, intentional. That's the word I'm looking for. Very intentional about what you do and how you set that up. Because if you start hitting the easy buttons on stuff, Sophia, as you know, as you've learned. Right. What is it? Easy.
[00:10:01] Speaker A: Oh, sliding scale security and easy.
[00:10:04] Speaker B: If it's getting too easy, it's probably a little less secure.
[00:10:07] Speaker A: Yeah, that's true. If your password can be five characters and all of them can be the letter A, it will be. It's easy for you to remember.
[00:10:13] Speaker B: That's right.
[00:10:13] Speaker A: But also easy for people to figure out.
[00:10:15] Speaker B: So good to know.
[00:10:16] Speaker A: That was a good one. That's a good piece of breaking news. So stay vigilant. Right. As always, stay vigilant. Really quick, a couple other things. Just a little sad fact here: the Xbox 360 store did close this week officially, so it's shut down. I know. It's very sad. That's probably the first console I played on, and it was, you know, just rest in peace.
[00:10:35] Speaker B: I loved the 360, man. It was a great console.
[00:10:37] Speaker A: Yeah, it was quite enjoyable. I really liked, that was my first console I owned. Really?
[00:10:42] Speaker B: Yeah.
[00:10:43] Speaker A: Huh.
[00:10:43] Speaker B: Like, I still have a Wii, but like, my last gaming console that I bought and was like, yes, I love this thing.
Actively buying games was a 360.
[00:10:52] Speaker A: It was a beast as far as, like, the selection of games. Like, you could do a lot with the 360. And now it's like, oh, yeah, it's very limited. Right. And, oh, you have to get the online code. But then we might actually discontinue that in a year. Like, you know. So it was.
[00:11:05] Speaker B: That sucks, man.
[00:11:06] Speaker A: Sad day. The store's closed.
[00:11:07] Speaker B: I wouldn't mind having a 360, honestly. They still have that great library of games. Of course, all the Xboxes are backwards compatible, aren't they?
[00:11:14] Speaker A: I think so. For most games, yeah. Yeah, for most games. So if you do have an Xbox 360. Hold it closer today. Hold it a little tighter.
[00:11:21] Speaker B: Send it to Daniel, care of ACI, right to the office.
[00:11:26] Speaker A: And then Xbox One S owners are having firmware updating issues. A lot of people with an Xbox One S are noticing they can't update their firmware anymore. Really?
[00:11:33] Speaker B: What's on that? Was it Series X or whatever it is? What's the.
[00:11:37] Speaker A: Yeah, the latest is either the X or the S. I think both. Okay. Christian says both.
[00:11:42] Speaker B: Both of them are.
[00:11:43] Speaker A: It's hard to keep.
[00:11:44] Speaker B: Which one's the top dog?
[00:11:46] Speaker A: Series X.
[00:11:47] Speaker B: Series X.
[00:11:48] Speaker A: Okay, thank you, Christian.
[00:11:49] Speaker B: Yes.
[00:11:49] Speaker A: I look up, the voices in our.
[00:11:50] Speaker B: Head have told us this is the Series X.
[00:11:52] Speaker A: Look up like it's God speaking to us.
[00:11:55] Speaker B: It's that.
[00:11:56] Speaker A: Thank you.
[00:11:57] Speaker B: You haven't seen it, most likely, but in Real Genius, when they got this guy, the character Kent, they put a receiver and a transmitter in his, like, headgear, and he's there talking to him in a modulated voice: Kent. I see you, Kent. This is Jesus.
[00:12:15] Speaker A: The Series X solos.
[00:12:17] Speaker B: Yes.
[00:12:19] Speaker A: So a couple pieces of, I guess, breaking news if you are an Xbox fan. And then Microsoft is also experiencing another outage. Right now it's affecting Microsoft 365 and Azure services. It's relatively recent. We're filming this a little bit in advance of the release date, so maybe as of the time this episode is being aired, they'll have a fix. But this is pretty breaking in that we're not sure of all the details yet, because we got a Teams.
[00:12:42] Speaker B: Message about it today.
[00:12:43] Speaker A: Oh, really?
[00:12:44] Speaker B: Yeah. You didn't see it from our IT department?
[00:12:45] Speaker A: Sure I did. I always check.
[00:12:47] Speaker B: They get onto me, saying I don't check my Teams.
[00:12:50] Speaker A: It was a busy morning. A lot going on. All right. Yes. So that's going on. We'll have to see if there's any updates. And I'll definitely check my teams a little bit later after this.
That's all I had for breaking news. Breaking news was fun today.
[00:13:02] Speaker B: I mean, several horrible things that are happening.
[00:13:04] Speaker A: Yeah. I mean, like the Xbox 360 officially dying.
[00:13:06] Speaker B: That is the worst.
[00:13:07] Speaker A: I mean, that's the worst.
[00:13:08] Speaker B: This is a. Yeah, this is a black day today.
[00:13:10] Speaker A: Forget fishing emails.
[00:13:11] Speaker B: Dark times are upon us.
[00:13:12] Speaker A: Well, you wore black. I didn't prepare.
[00:13:14] Speaker B: This is my Wild West Hackin' Fest shirt from.
[00:13:16] Speaker A: From last year. No, no, that's. That's year before, right?
[00:13:19] Speaker B: No, this was last year, right?
[00:13:20] Speaker A: No, last year was the X Files one.
[00:13:23] Speaker B: Was it?
[00:13:24] Speaker A: Yes, it was.
[00:13:25] Speaker B: Right?
[00:13:25] Speaker A: I. "I Want to Be Leet." That was the last one.
[00:13:27] Speaker B: That's right.
[00:13:27] Speaker A: That was last year's theme.
[00:13:28] Speaker B: That's right. That was last year's theme.
[00:13:29] Speaker A: The only one I've been to, so.
[00:13:30] Speaker B: It's hard to forget. I'm losing my mind.
[00:13:32] Speaker A: I thought you had a handle on this one. Because this year's, like, what, Space Cowboys or something like that.
[00:13:35] Speaker B: They're calling it Space Cowboys, but it's really supposed to be Firefly.
[00:13:40] Speaker A: Oh, okay. Yeah, I haven't seen that one.
[00:13:43] Speaker B: Due to legal reasons, they cannot officially badge this.
[00:13:46] Speaker A: They can't afford the copyright as. Yeah, okay.
[00:13:48] Speaker B: That stuff.
[00:13:49] Speaker A: I'll add Firefly to my list, then.
[00:13:51] Speaker B: Firefly. Great show. 14 episodes.
[00:13:54] Speaker A: Really?
[00:13:55] Speaker B: All there is.
[00:13:56] Speaker A: Okay.
[00:13:56] Speaker B: Right? Great show. I'll just tell you my story. I remember when it was on TV, but I didn't watch it. So, like, okay, I was, like, in the middle of other stuff.
[00:14:06] Speaker A: Sure.
[00:14:06] Speaker B: Years down the road, I'm watching random movies. You know, friends are like, oh, you gotta watch this, you gotta watch this. A friend of mine's like, you're gonna love this: Serenity. I'm like, okay, cool. Pop it in, and as it goes, I'm like, wasn't this a TV show?
Yes, it was. And they got their own full-length movie, even though it had one season. So I'm like, oh, I get it. I ran out, bought the box set for the show, power-watched it. Just binged that sucker as fast as I could and was enamored with the show. Loved it. Loved it. My friend George, he gets pissed. He's like, one season? Really? This is all we get? Really? Sent him off in a tizzy about that show. But it was so popular, even though it got cancelled, that the fans clamored hard enough that the studio was like, if we give you a movie, will you shut up? And they said, it's better than nothing, we'll take it.
[00:14:58] Speaker A: Oh.
[00:14:59] Speaker B: And so they did. And they gave them a full movie, and it was great.
[00:15:02] Speaker A: I'll have to look into it, especially if it's only 14 episodes. I can do that in a week.
[00:15:04] Speaker B: Oh, yeah. There's nothing to it.
[00:15:06] Speaker A: Yeah, that's easy. Okay.
[00:15:07] Speaker B: Very entertaining.
[00:15:08] Speaker A: I will add it to my homework list before we go to Wild West Hackin' Fest this year, because I did not understand the X-Files reference last year, since I'd never seen it. And as I'm watching it, I'm starting.
[00:15:16] Speaker B: You're still watching X Files, right?
[00:15:17] Speaker A: Yeah. I took a pause because I'm watching The Sopranos. I just watched "Home." Oh, yeah. Yeah, that's the one.
[00:15:22] Speaker B: That was like, dude, it was. To this day, I've seen it ten times. It's still that disturbing.
[00:15:28] Speaker A: It's still, like, pretty heavily censored because of how disturbing.
[00:15:31] Speaker B: It only aired once.
[00:15:33] Speaker A: Okay.
[00:15:34] Speaker B: And then it was like, it aired in full once, and then the subsequent times that it was aired in syndication. It was. Yes.
[00:15:42] Speaker A: It was censored because of how disturbing it was.
[00:15:44] Speaker B: Yeah, it was a very disturbing episode.
[00:15:47] Speaker A: I got some work to do then.
[00:15:48] Speaker B: Yeah.
[00:15:48] Speaker A: I feel like I get some bonus content every episode. New homework to add to my ever expanding list.
But moving on from our breaking news, getting into our articles for today. This first one, it's our good friend Meta and good old Zuck. Good old Zuck. Meta removed 63,000 Instagram accounts linked to Nigerian sextortion scams. So in this case, it's a win for Meta, I would say.
[00:16:09] Speaker B: Yeah.
[00:16:09] Speaker A: This was a good thing.
[00:16:10] Speaker B: This was a. It was an absolutely good thing. I'm glad. Here's what. Here's. Here's what kills me.
63,000. 63,000 accounts dedicated to the non-fun that is sextortion scams. Right?
If you're not familiar with this, I don't know how much we can say and keep this family friendly, but suffice it to say, es no bueno, okay? Not cool. As Nic Cage would say, not cool.
Right. At all. And for that. So that was kind of my big takeaway on this article, right? This was a one-time shot, and they pulled down 63,000 accounts. How many other accounts are there? This opens the larger conversation of, right, how much of online activity is people, and how much is, like, scams and bots and bullshit?
I'm just saying it, you know, isn't.
[00:17:16] Speaker A: There, like, a theory that, like, in however many years, the Internet's gonna be basically useless because it's gonna be just AI and bots talking to each other. Yeah.
[00:17:22] Speaker B: Yeah. That will be the singularity.
[00:17:24] Speaker A: Yeah.
[00:17:25] Speaker B: Where we just become, you know, online is useless for us because it is controlled by our digital overlords.
[00:17:35] Speaker A: Since they, in this, like, you said 63,000 is a lot, but that just. Yeah, it makes you wonder. This is just the ones that they took down.
[00:17:42] Speaker B: Yeah.
[00:17:43] Speaker A: So how many accounts are there out there that were missed? Because it's hard to catch all of them. It's not. It's not like Pokemon. You can't catch them all. It's very difficult. So it does say that in this case, there was this smaller, coordinated network that targeted a lot of adult men, but it is referenced later in this article, the Yahoo boys, which was that group that they targeted a lot of teenagers.
[00:18:01] Speaker B: Yeah, kids. And unfortunately, that ends not, well. A lot.
[00:18:05] Speaker A: Yeah.
[00:18:06] Speaker B: Right.
[00:18:06] Speaker A: Because they blackmail them.
[00:18:08] Speaker B: Right. And then they're so freaked out about what's getting ready to happen to them.
[00:18:11] Speaker A: Yeah.
[00:18:12] Speaker B: Right. That they. They do things that they.
[00:18:14] Speaker A: They self check out.
[00:18:15] Speaker B: Yeah. Then they shouldn't do that. Right. That's why I hope they put these people under a prison if they. If they can catch them.
[00:18:22] Speaker A: Yeah.
[00:18:22] Speaker B: Right. Because that's what you deserve. Like, you are the complete worst of the worst.
You are right there. You. I'll just stop because I'm. I'm gonna go down a rabbit hole.
[00:18:33] Speaker A: I don't blame you.
[00:18:34] Speaker B: Of anger and disgust.
[00:18:35] Speaker A: I don't blame you. This is a. One of the things I saw this week about Meta. Cause I feel like in recent news, a lot of the stuff we've seen about meta has been, like, maybe not so great. Like, I know they were kind of in some trouble, I think, in the UK for they were violating some form of, like, some part of GDPR.
[00:18:50] Speaker B: Oh, yeah, GDPR.
[00:18:51] Speaker A: And so I feel like a lot of the stuff I've seen about them recently has been like, oh, Meta, what are you doing? You're getting yourself in trouble. But I saw, like, a part of an interview with Mark Zuckerberg over the weekend, and it seems like he's.
I don't want to, like, defame him. Not that he's not normal, but he's got a reputation for a reason. Right.
[00:19:07] Speaker B: I like how you emphasized the fact that he's not normal by saying, I don't want to say he's not normal, but.
[00:19:13] Speaker A: He'S got a reputation for a reason. There's jokes, you know.
[00:19:16] Speaker B: Oh, yeah. The whole, he's an Android or. Yeah.
[00:19:18] Speaker A: Because he just. He's a little awkward in his behaviors. But I. The more, like this interview he did, he's kind of becoming normal. He was talking about how you hear these people, these tech companies that are like, you could just upload your consciousness into the cloud. And da da da. And I'm like, that's B's. You can't do that. That's not what makes you a human. It's your heart and your mind and your actions and da da da. And went on this tangent and I was like, okay, Mark.
[00:19:42] Speaker B: Yeah. When did he get red pilled?
[00:19:44] Speaker A: Right. What happened? Because I don't know.
[00:19:46] Speaker B: They say that, like, starting to work out and things like that is the, is the road.
[00:19:50] Speaker A: He touched grass.
[00:19:51] Speaker B: He did. He did.
[00:19:52] Speaker A: And it's curing him.
And, yeah, obviously he's a very smart guy. I'm not, like, trying to. Christian says it was a software update.
Zuckerberg 3.0 is GPT-4, Zuckerberg edition. That's a good one, Christian. I wish we had, like, a graphic of him.
[00:20:09] Speaker B: Well played.
[00:20:10] Speaker A: After he makes the magic behind the scenes, he was responsible for our boot loops graphic last year. We got a lot of people commenting.
[00:20:15] Speaker B: About that, saying it was my idea, but he brought it to reality and he did it in a spectacular fashion.
[00:20:20] Speaker A: It was good teamwork between Daniel and.
[00:20:22] Speaker B: Teamwork makes the dream work, man. With my ideas and his creativity and his skills. Dream team. We brought to you the boot loops. I told him this morning, I was like, I love that graphic. You did such a good job with that.
[00:20:31] Speaker A: It's a shame we can't use it again. Yeah, not that I want that to happen again, but, like, it's just such a shame we only get one use out of it.
[00:20:37] Speaker B: Single serving.
[00:20:38] Speaker A: We might talk about crowdstrike later this episode, so maybe we can reuse it. Who knows?
[00:20:41] Speaker B: We'll have to stick up on the screen.
[00:20:43] Speaker A: But yeah. This week's top three people in cybersecurity: Christian, Daniel, and Mark Zuckerberg. So thanks for doing that, Meta. That's pretty cool of you. And we hope that they continue to dismantle these networks.
[00:20:52] Speaker B: Kick their ass.
[00:20:54] Speaker A: Well, in other big tech news, slightly less positive: X begins training Grok AI with your posts, and this article goes into how you can disable it as well. The reason I wanted to talk about this is because I know this is your article. This is. Yeah, this is one that I pulled. And the reason that it stood out to me is because I will admit I'm an X user. Okay. I probably am on social media more than I should be. I know a lot of people use it. I could touch some grass, I know. And even when it was Twitter, like, I just. I was probably on there too much. But, you know, it's fun to scroll. So anyway, yeah, laugh it up. Yes, they know they've got me, got me.
[00:21:31] Speaker B: Like, you know, if we just make another thing after this, they'll go to it, they'll never leave.
[00:21:36] Speaker A: And every, every big tech company, every social media company has their own, like, AI-type thing going on. Obviously there's ChatGPT, that's OpenAI, right? There's that whole thing. But, like, Meta AI is a thing. I go to search something in Instagram and it, like, doesn't take, and it thinks that I'm asking the AI a question and it opens a chat with the AI. I'm like, stop it. I don't want to talk to your AI. But in this case, this is Elon Musk's X version. It's called Grok. And this is not, like, a new thing, having, like, an AI trained on the input of your users. I think that's part of why Meta was getting in trouble. But in this case, like, Instagram's free to use, right? Facebook is free to use.
[00:22:09] Speaker B: Correct.
[00:22:10] Speaker A: And the argument was, like, free. Free as in you don't pay any money, yes, but, like, I think what it was, was they were going to offer: if you want to pay for, like, an upgraded version, then we won't use your data, or then you won't get ads and things like that. And it was like, well, that's not really fair. In this case, X is free to use, but Grok you have to pay for. You have to become a premium user to use Grok, but they're scraping your data to train the AI anyway. So they're training the AI that you don't even get to use unless you pay for it.
[00:22:38] Speaker B: Yeah, it's a little weird.
[00:22:39] Speaker A: So that to me was like, huh? Like, it's one thing if I pay for the AI, and you say by paying for this, you agree to da.
[00:22:44] Speaker B: Da da da at the end of the day, right? Well, you do.
By using the platform, you agreed to the EULA that says Grok gets to scrape your data.
[00:22:53] Speaker A: That's true.
[00:22:54] Speaker B: And learn based off of what you post, what you watch and what you like, what you do, and how long you stay somewhere and all the 17 other billion data points that it is looking at, it does seem unfair. And maybe you think that is unfair as well. I mean, I feel like there's probably quite a bit of people out there that might go, yeah, that's a bridge too far.
If I am going to give you my data to train it, I want access to the thing so that I can use it.
[00:23:24] Speaker A: Yeah, exactly.
[00:23:25] Speaker B: So this is where you've got to make your voice heard and go, hey, I want access to Grok. If I'm going to train Grok, I want access to Grok. And I'm going to stop using the service and or start some sort of grassroots, go to wherever, you know, do whatever. I don't know how to.
[00:23:42] Speaker A: Or I guess it's just delete changes.
[00:23:43] Speaker B: But, like, memes is a good way.
[00:23:45] Speaker A: Yeah, that does get Elon's attention.
[00:23:47] Speaker B: Right? You get some good meme game going on, and. Yeah, he'll probably see it.
[00:23:51] Speaker A: He will probably see it.
[00:23:52] Speaker B: That is funny.
[00:23:53] Speaker A: And sometimes, to his credit, he'll see somebody say something and be like, oh, yeah, you're right. That's kind of b's. Let me see if I can fix that. To his credit, I've seen that several times.
[00:23:59] Speaker B: I have as well. He is an interesting individual as well.
[00:24:04] Speaker A: But in this case, it was very, like, they quietly started rolling this out and saying, we're going to start training our AI with your stuff, because Grok was not around when it first converted. And so they give you the option to not participate in that, but there was no alert that, hey, by the way, we're doing this now. And it's opt-in by default, so you have to go in and hunt it down in the settings.
[00:24:26] Speaker B: So you are allowed to opt out without any kind of reprisal or repercussion.
[00:24:31] Speaker A: You can.
[00:24:31] Speaker B: Yeah. You just, I suppose, get a less customized experience.
[00:24:35] Speaker A: Right, exactly.
[00:24:36] Speaker B: Whoop dee doo.
[00:24:38] Speaker A: Oh, no. What am I gonna do?
[00:24:39] Speaker B: Hey, you know what? Me, I like rolling the Internet dice. Let's just see what we get. I don't want a personalized experience. I don't need you to... Hey, if I need something, I'll find it.
[00:24:52] Speaker A: Yeah.
[00:24:52] Speaker B: I don't need your help. Thank you. I don't need you to know what side of the bed I sleep on and when I go to the bathroom and all the other things. Oh, you're Daniel. You're almost out of toilet paper, blah, blah, blah. Which these things do. I know when I'm almost out.
[00:25:06] Speaker A: You don't have to tell me.
[00:25:07] Speaker B: I go, look at that. The role looks small.
I'll go buy more. Yeah.
This is weird how we're outsourcing these things to robots. All right, but if I can go in and just turn this off, then, yeah, they probably could have been a little more vocal about it.
[00:25:24] Speaker A: It's a little sneaky. It's a little sneaky. But I guess that's not sneaky.
[00:25:27] Speaker B: It's not necessarily. Yeah, maybe you could see this as sneaky.
[00:25:30] Speaker A: I'm sure, like, X is not the first platform to do something like this.
[00:25:33] Speaker B: No, that doesn't make it right. That's a tu quoque argument, right?
[00:25:36] Speaker A: I do hate the, like, whole, like, it's opted in by default and you have to opt out. But I guess if they didn't opt people into this by default, who would go in and turn that on on purpose?
[00:25:44] Speaker B: Right. Well, and that's where they should just give you a notification as you log in. Hey, by the way, hey, we have Grok. You can't move past this until you answer these questions. Yeah, you must scroll and read, but that pisses people off and makes them go, ah, screw it, I'm just gonna turn this off. No, you're not. You're going in.
[00:25:59] Speaker A: If your account's, like, private, which the account I primarily use is private because I don't want people to find me on there. Don't hunt me down. It's my private space. My private personal. I never post.
[00:26:11] Speaker B: You should get a sock puppet account.
[00:26:13] Speaker A: Oh, okay. Well, there you go. I only ever repost, like, memes and stuff. So that's part of why I'm like, I don't want you seeing what I repost. It's stupid. But anyway, if your account's private, Grok can't use it to train. But if your account's public, this is something that you should probably know about. So, wanted to mention that.
[00:26:26] Speaker B: Good to know.
[00:26:27] Speaker A: Yes. But moving on from our good friends at X and Meta and OpenAI and all of the above, going into our next segment, this is behind bars.
[00:26:45] Speaker B: Break the law.
I'm not hearing the music or anything.
[00:26:48] Speaker A: I'm sure it's going.
[00:26:49] Speaker B: Everybody's probably like, why are they just sitting there in total and utter silence?
[00:26:53] Speaker A: We just like to look at you.
[00:26:54] Speaker B: But it will be in post. They will have that.
[00:26:56] Speaker A: I'm sure. You should have been taking your Monster can and, like, clinking it against.
We've got a bit of a double feature for this Behind Bars. First up, we have a former Avaya employee that got four years for an $88 million license piracy scheme. And of course, they have a pirate flag in case the piracy wasn't clear enough. But these three individuals that orchestrated... Is that a real flag?
[00:27:18] Speaker B: I'd say AI generated.
[00:27:20] Speaker A: It may very well be. Honestly, this is the world we live in. It probably is. I don't know the signs to look for, but it's probably AI generated. I'm a little, I'm a little cynical. But these three individuals orchestrated a massive pirating operation involving the sale of Avaya business telephone system software licenses. I had not heard of this company before this article came up. But 88 million is not chump change.
[00:27:41] Speaker B: No, that's a lot. It's weird. To me, when I read this article, I'm reading the whole thing thinking these people stole $88 million.
Because that's kind of how they bill it. And then at the end, it says something to the effect of... oh, here we go. The DOJ says Hines bought the ADI licenses and sold them to resellers and end users worldwide at a much lower price than their design costs, causing massive losses for Avaya, estimated at $88 million. Oh, so.
[00:28:17] Speaker A: So it's not how much money they actually accumulated from it?
[00:28:19] Speaker B: Correct. Cause I'm like, when you look at how much they had to repay, which was, one of them had to pay 4 million, another one had to pay 4 million. Another one had to pay 2 million. I'm like, that's not $88 million. Did they just blow that money? How do you blow, like $78 million?
[00:28:37] Speaker A: Yeah, that's.
[00:28:39] Speaker B: I mean, I guess you can. It's totally possible.
[00:28:41] Speaker A: Yeah.
[00:28:42] Speaker B: But it's time to start selling assets at that point, right?
[00:28:47] Speaker A: Buying houses. That's what they do.
[00:28:49] Speaker B: They're in the housing market. They're opening Airbnbs, investing with the BlackRocks and State Streets. Right. This is. This is their thing.
[00:28:58] Speaker A: Then. That makes more sense then, because I saw four years, I was like, that's a pretty insignificant amount of time for 88 million.
[00:29:03] Speaker B: Right.
[00:29:03] Speaker A: But that makes sense.
[00:29:04] Speaker B: There you go. Right. They're saying that it accumulated up to an estimated $88 million. When articles do that, just say, here's how much we know they lost, or, this was an estimated loss of $88 million. Not that they stole $88 million. I hate when they do that. Yeah, don't make articles like that, man. Don't do it.
[00:29:23] Speaker A: Yeah, that is interesting. Hmm. Well, they're going to serve their time. I thought it was interesting. One of them is serving.
[00:29:27] Speaker B: Drivers must know about this new trick. That's. That's what it is. Number three will shock you.
[00:29:33] Speaker A: They hate him. Drivers hate him. One of them was sentenced to four years. One was sentenced to a year and six months, and then one was sentenced to a year and a day. Is that normal? Is that, like, a common thing? It's very odd. Like, I'm sure it's something to do legally, maybe.
[00:29:46] Speaker B: Right?
[00:29:46] Speaker A: Because this year was a leap year or something. And, like.
[00:29:50] Speaker B: No, they would have just said a year.
[00:29:51] Speaker A: Okay.
[00:29:52] Speaker B: Right. I would. I would think.
[00:29:53] Speaker A: Yeah. It's just odd. It's very specific.
[00:29:56] Speaker B: So maybe it has something to do with where they're spending their time.
Right. Because I think that, like, county jail facilities, I think they only hold people up to a year, and then if you're doing over that, you're going to a prison.
[00:30:11] Speaker A: But if it's just a year, like, in a day, they don't. It's not worth.
[00:30:14] Speaker B: Right. Maybe that's, like, the cutoff day or so they're going to spend their time in the. In the county hotel.
[00:30:21] Speaker A: Yeah.
[00:30:21] Speaker B: And then the other two dudes are going to the state prison.
[00:30:24] Speaker A: Or do you think maybe if it's like, if the cutoff is a year, let's just say maybe they said, no, no, no. You need to go to federal prison as well.
[00:30:29] Speaker B: So we're giving you an extra.
[00:30:31] Speaker A: Day just to push you over the.
[00:30:33] Speaker B: Edge, you to steal.
And you are going to have fun there.
[00:30:39] Speaker A: He's gonna do his time either way.
[00:30:41] Speaker B: Yeah.
[00:30:41] Speaker A: Dusty O. Pierce, which is.
[00:30:43] Speaker B: That's a stupid name.
[00:30:47] Speaker A: I wasn't gonna say it.
[00:30:48] Speaker B: I say stupid. It sounds made up.
[00:30:50] Speaker A: Yeah. Dusty O. Pierce.
[00:30:52] Speaker B: Those poor people out there going, my name is Dusty Pierce.
[00:30:54] Speaker A: He sounds like a prospector.
[00:30:55] Speaker B: Yeah.
[00:30:55] Speaker A: Or maybe it might be a woman, actually. I don't know.
[00:30:57] Speaker B: What are you doing in my.
You claim jumper.
[00:31:00] Speaker A: I'm Dusty O. Pierce. And these 88 million are mine.
[00:31:03] Speaker B: That's right. Who are, well, of the law offices of Dusty Owen Pierce.
[00:31:08] Speaker A: Yes, absolutely. Well, they are going to do their time for that license piracy scheme. So crime doesn't pay. Crime doesn't pay.
[00:31:16] Speaker B: I hope not.
[00:31:16] Speaker A: In other news, and this is also kind of a callback to something we talked about a couple weeks ago. You may remember we discussed a man who was in some trouble for calling in some death threats to one Nintendo. That Japanese man is getting jailed for repeated death and bomb threats that saw multiple Nintendo events canceled. And the judge excoriates his selfish... Excoriates.
[00:31:36] Speaker B: That's a fun word.
[00:31:37] Speaker A: I was gonna say, I've not seen that word in a long time. Selfish motive and persistent and vicious behavior. So this guy, if you don't remember, was really mad that he kept losing at, I think it's Splatoon. And every time he would lose, he would get angrier and angrier. And he started calling Nintendo, or started submitting forms to Nintendo, like, complaint forms, basically, like, threatening them, threatening events and things like that, and didn't follow through on any of them, thank goodness. But they ended up having to cancel a ton of their events. Like, they had an event at the beginning of the year that was supposed to happen, some kind of, like, not a Nintendo Direct, but something like that. They had to cancel it for the safety of... They couldn't take the chance. Right.
[00:32:09] Speaker B: Yeah.
[00:32:10] Speaker A: So they argued that, like, we lost all this money because we couldn't take the chance. We had to cancel all these events because you're calling in these bogus threats. And definitely some.
[00:32:18] Speaker B: Some weight behind that statement.
[00:32:19] Speaker A: Yeah, yeah, absolutely. So Takemi Kazama, I think, is how that's pronounced. He's a 27-year-old dude. He's guilty of intimidation of business by repeatedly posting threats of harm, and he's gonna serve some time.
[00:32:29] Speaker B: Now he has a judge excoriating him. How often do you get to say the word excoriate? Speaking of words you don't say often.
Cloaca. Right.
It was used in The X-Files. I was like, that's not something you hear every day.
[00:32:50] Speaker A: Did you say something earlier this week or last week about trying to put that in an episode?
[00:32:56] Speaker B: No, I didn't see it until, like, this week.
[00:32:58] Speaker A: I was gonna say. I thought somebody. I heard somebody say that earlier this week. Wow. Wow.
[00:33:01] Speaker B: Yeah. Fun.
[00:33:02] Speaker A: That's.
Oh, somebody did. Yeah.
[00:33:06] Speaker B: Did I?
[00:33:06] Speaker A: Apparently you just talk about it so often.
[00:33:09] Speaker B: Listen, huh? Yeah. Right? We're seeing words that we don't see all the time. Makes me think of patterns. It's funny.
[00:33:17] Speaker A: There's a conspiracy. The big government is trying to educate us. They will not prevail.
[00:33:22] Speaker B: That's right.
[00:33:23] Speaker A: So in this case, the judge said that there's nothing to be said for such a selfish motive. The crime was persistent and vicious. Sentenced him to one year in prison, suspended for four years. I'm not, like, super well educated on, like, the law and stuff. This is obviously over in Japan. So what does that mean when you're sentenced to one year in prison, suspended for four years? Is that, like, when you redshirt an athlete and they, like, are there, but they can't play, so they get, like, an extra year of eligibility?
[00:33:48] Speaker B: So that's a good question. I don't know that I'm checked out on this either, as far as, like. So you're saying that he got a year in prison, but the prison sentence was suspended for four years.
[00:33:59] Speaker A: It's okay.
[00:34:00] Speaker B: Four years, does he go to prison or.
[00:34:02] Speaker A: Huh? Okay, so if somebody's handed a five-year sentence with three years suspended, they would serve two years in prison and then be released facing the conditions. So, we're giving you time to prove yourself, basically. Time outside, like, with conditions. Okay, you're gonna serve two years of this time. If you can do this time outside, then you're good. If you mess up again during the time that we let you out, you're back. So, one-year sentence, four years suspended. So he'll go to prison for a year.
[00:34:28] Speaker B: That's such a weird way to say that.
[00:34:29] Speaker A: Yeah. Like, why not just say you're going to jail for a year?
[00:34:31] Speaker B: It usually means you're not going to jail. Right. We're suspending the time. Like, suspended time needs to be held in a limbo state.
[00:34:40] Speaker A: There's going to be some lawyer that stumbles across this episode by chance and is going to be like, these idiots.
[00:34:45] Speaker B: Yes.
Not a lawyer.
[00:34:49] Speaker A: And they're going to be like, by the way, that Dusty O. Pierce joke? Not funny. Not funny.
[00:34:53] Speaker B: Yeah, it's Dewey, Cheatem and Howe.
[00:34:55] Speaker A: Right. Well, I just saw that this guy was officially sentenced and wanted to provide an update as kind of a secondary Behind Bars.
[00:35:03] Speaker B: A secondary... I wonder what Japanese prisons are like.
[00:35:04] Speaker A: Yeah.
[00:35:05] Speaker B: You don't hear people telling horror stories about Japanese prisons.
[00:35:08] Speaker A: No, because, like, everything... Like, Japanese subways are super nice and clean because of, like, the social norms there and stuff and the laws that are in place. So I wonder if prison is the same way, where it's, like... Not higher class, but, like, a lot nicer, I guess. It's still prison.
[00:35:21] Speaker B: I would assume in Japan they're very, you know, they're very neat and organized. There's just, like, the culture of that there. So I would assume that, yeah, it's probably pretty swanky.
[00:35:32] Speaker A: You're gonna do time anywhere.
[00:35:33] Speaker B: You know, compared to, like, San Quentin.
[00:35:35] Speaker A: I was gonna say it's better than a gulag.
[00:35:37] Speaker B: Indeed, indeed, indeed. That's a place you don't want to find yourself.
[00:35:41] Speaker A: No, no, trust me, I've played Call of Duty.
[00:35:43] Speaker B: I know you were in a gulag. In Call of Duty.
[00:35:45] Speaker A: Yeah. I think it's Warzone where, if you die, you go to the gulag and you have to fight somebody in the gulag. And if you win, you get respawned onto the map. Yeah, I think it's Warzone. Christian might be able to correct me on that, but yeah, you fight somebody in the gulag. It is Warzone.
[00:35:59] Speaker B: I saw some movie with Ed Harris and Colin Farrell.
[00:36:05] Speaker A: Okay.
[00:36:06] Speaker B: And they were in a gulag in Siberia and they escaped. I want to say it was a true story.
[00:36:12] Speaker A: The Way Back.
[00:36:13] Speaker B: The Way Back. Yeah. Crazy flick, man. Them escaping and them going... So they, they basically escape from Siberia, go through Russia into, like, Mongolia and across the Gobi Desert and.
[00:36:25] Speaker A: Yeah.
[00:36:26] Speaker B: Crazy flick.
[00:36:28] Speaker A: Okay. Interesting.
[00:36:29] Speaker B: Yeah, it was pretty good, though.
[00:36:30] Speaker A: It's pretty good.
[00:36:31] Speaker B: I enjoyed it.
[00:36:32] Speaker A: Put it on the list.
[00:36:33] Speaker B: Yeah.
[00:36:33] Speaker A: It's got Saoirse Ronan in it. Interesting. Saoirse Ronan, she was in Lady Bird. She was in the new adaptation of Little Women.
[00:36:41] Speaker B: Who'd she play in Lady Bird?
[00:36:43] Speaker A: She was the main girl. She was the daughter.
[00:36:44] Speaker B: Oh, yeah. Okay.
[00:36:46] Speaker A: The pink hair, I think.
[00:36:46] Speaker B: Yeah, yeah, yeah.
[00:36:47] Speaker A: I. So she was, she was the daughter. She's good. I like her.
[00:36:49] Speaker B: It's been a while since I've seen that movie.
[00:36:50] Speaker A: Yeah, I think I only watched it once. It's decent. Yeah. That is gonna do it for the first half of Technado. We're gonna take a break and I'm gonna go get some water, but we've got more on Kaspersky and CrowdStrike and all kinds of fun stuff, so don't go away. More Technado coming up after the break. Anthony, what are we gonna be talking about?
[00:37:07] Speaker B: We are talking about our newest and most excellent cloud plus course. This course really does an amazing job of taking the learner from the very.
[00:37:19] Speaker A: Fundamental aspects of cloud and then walking.
[00:37:23] Speaker B: Them through some of the more advanced topics. They're going to learn about how to secure the cloud, how to optimize the cloud, how to save costs with the cloud. So this is not a course with complete bias to AWS or Google Cloud Platform or Microsoft Azure. We breathe life into this material by doing demonstrations across all of the big three cloud vendors. We have a lot of fun in.
[00:37:53] Speaker A: Cloud, and we know that you will, too. So come check it out.
Welcome back to Technado. Thank you for sticking with us through that break. It was a fun break for us.
[00:38:07] Speaker B: I don't think you would have stuck with us since, you know, part of the conversations I did between, I got.
[00:38:13] Speaker A: My drink of water and then I came back and I learned about cryptids. So it was great.
[00:38:17] Speaker B: It was a good time.
[00:38:18] Speaker A: It was great. I learned so much scientific information, but moving on from that, not that it wasn't a lovely conversation. That's not what you're here for. No, I promise.
[00:38:27] Speaker B: They would be if they heard it. I totally, yeah.
[00:38:29] Speaker A: We'll do Technido.
[00:38:30] Speaker B: Yeah. Techno red.
[00:38:32] Speaker A: That's the next one. The lights are off to protect our identities.
[00:38:34] Speaker B: Yeah. Yeah.
[00:38:35] Speaker A: I did promise you earlier in the episode some Kaspersky. So this is Deja news.
[00:38:41] Speaker B: Deja news.
Oh, man. Y'all cannot hear the masterful rendition of Deja Vu that just came over the loudspeaker.
[00:38:58] Speaker A: We can't hear the music in here. So our director gives us.
[00:39:02] Speaker B: I thought it was Beyonce.
[00:39:03] Speaker A: It was beautiful.
[00:39:04] Speaker B: I thought it was.
[00:39:05] Speaker A: It was.
[00:39:06] Speaker B: That's how close of a truly beautiful.
[00:39:07] Speaker A: I thought I heard Jay Z ready to come in. So thank you, Christian, for that. Really do appreciate it. But on to the news, as promised, Kaspersky says Uncle Sam snubbed their proposal to open up its code for third party review. So, Kaspersky, I think you said, like, when we talked about this last time.
[00:39:26] Speaker B: I think it was the first time we talked about Kaspersky kind of getting the, you know, "y'all get on out of here now, git" from the US government that I'm like, why couldn't they just do, like, a Trusted Foundry thing? You're having trouble, aren't you? I love watching her just, like... Get it out of my head. It's no longer there, but it is still there, which is entertaining for you and me.
[00:39:48] Speaker A: It's still there. Yeah. I'm sorry. You can take it.
[00:39:50] Speaker B: Oh, it's funny. I get myself under control. But yeah, if you're not familiar with Trusted Foundry, that's where, if you have, like, a hardware device and you want it to be a part of, you know, the US government, they will take a look at it, in very scrutinizing fashion, before it is allowed. And I thought, why couldn't they do this with Kaspersky's stuff?
Again, I got no skin in the game. I just want to make sure that that is out there in the open when it comes to Kasper... skis. Kaspersky. Kaspersky.
When it comes to them, I have no skin in the game, as far as, like, I'm not a defender or detractor of Kaspersky. If they've done stupid, bad stuff that is a detriment to the health and safety of the US, then to hell with them. If they are not, though... And I would also like to applaud and thank the commenters out there. A lot of people have talked about this in the comments section. I would champion more. Please continue. You know, put it in the comments section so that we can all have this kind of continual conversation, become aware of the bits and pieces. It's really hard to kind of bring everything together. What's truth? What's fact? What's fiction?
So yeah, it's really helpful for people out there that know. Some people said something about how Kaspersky was busted on a home laptop where they were stealing NSA secrets. Oh, yeah, right. If that's true, bad Kaspersky. And that makes us not want you in the country. But going back to our article, kind of playing catch-up here: they have said, hey, we reached out to the US. We actually do this with other governments and other countries, where we say, you know what, we will give you the source code. We will open up to you to scrutinize as you will. We will have no control over that. At least that's what they're saying. How much truth is in that, I don't know, but it is interesting. As far as the conversation goes, it does seem that Kaspersky is willing to play ball in a lot of ways to say, we are as transparent as a pane of glass.
Come on down, take a look at our code. We have nothing to hide.
At least that's what they're saying.
And you find something interesting over there?
[00:42:06] Speaker A: I'm looking at the comments. Oh, and of course everybody's looking at it. Not our comments, but the comments from the article.
[00:42:11] Speaker B: Yeah, yeah, yeah.
[00:42:12] Speaker A: Of course. They're like, they don't even need... Why do they need US government approval? Sorry, I'm not talking into the mic. Why do they need US government approval? If he wants to open his code to third-party review, he can just do it. If he wants to post source code on GitHub, he can. He's waffling. He's trying to rattle Washington.
[00:42:25] Speaker B: I think they were saying more along the lines of you set up the thing that you will trust that you.
[00:42:31] Speaker A: Right, exactly. And that's what people are saying.
[00:42:33] Speaker B: And a, they don't want to open their code to the world. They don't want to make it an open source project.
[00:42:37] Speaker A: Right.
[00:42:38] Speaker B: What they're saying is I want to be able to open the code to the governments specifically to allow them to vet and verify that our code has nothing sneaky and never will have anything sneaky because you always have access to the updates into the new code and everything that goes into our product so that you can maintain the idea.
If this is all true, taking it at face value, I think that's a big step. That's huge for an organization to say, I'm going to let you just scrutinize my entire code. And the fact that apparently they're doing it with other countries as well, it's like, this ain't our first rodeo. We do it in Brazil and Saudi Arabia, I believe.
[00:43:15] Speaker A: Yeah, right.
[00:43:16] Speaker B: Because, well, I don't know anything about Brazil's government, but Saudi Arabia has a much more tightly controlled governmental entity.
So they were willing to open up for them, and they were willing to open up for you. And correct me if I'm wrong, in the article they also say that they reached out to the US government to crickets. They received crickets in return.
[00:43:38] Speaker A: Just snubbed.
[00:43:40] Speaker B: Yeah, snubbed was the word the article used. Now, it does seem The Register is a bit on their side.
And I'm not a huge fan when... Not, you know, I don't want to cast any aspersions on The Register necessarily, but just the fact that if you are a news outlet, just give me the facts. I don't need your op-eds. Right. I don't need your opinions in the facts. Just give me the facts. If Kaspersky is doing this, then cite sources.
Give me the information, allow me to form my own opinion on these things. But I just thought it was worth... You know, the conversation continues. It does seem, though, that Kaspersky is willing to play ball with the US government. I would actually like to see some response coming from the US government, one way or the other.
[00:44:27] Speaker A: Yeah. What's your reasoning like? We're not accepting it.
[00:44:29] Speaker B: We think this is horse crap or you're a bunch of lying scumbags. Whatever. I don't know.
[00:44:34] Speaker A: Yeah.
[00:44:34] Speaker B: I want to know what the response is from the US and why they would say no. That's not on the table for X, Y and Z reasons. Again, I like to be an informed person.
[00:44:45] Speaker A: Sure.
[00:44:46] Speaker B: Hopefully you do as well. And that's why I encourage, if you know something about this, throw that in the comments section. For the rest of us that are low information on this, I want more information.
[00:44:56] Speaker A: That almost sounded like the cadence of Arnold Schwarzenegger the way you said that.
[00:44:59] Speaker B: Oh, really?
[00:44:59] Speaker A: That did. The comments for the rest of us. The way that you.
[00:45:01] Speaker B: The comments for the rest of us.
Your leadership skills.
[00:45:05] Speaker A: Do it now.
[00:45:06] Speaker B: Come on.
[00:45:07] Speaker A: I'm here. Do it now.
[00:45:08] Speaker B: Do it now. Come on.
[00:45:12] Speaker A: That's Kaspersky. To the chopper!
Oh man. Thank you for that. I needed that.
[00:45:19] Speaker B: That was beautiful.
[00:45:20] Speaker A: I did finally calm myself. It was just Christian's lovely Beyonce impression that was causing me.
[00:45:24] Speaker B: I had to. I had to bring my game, right.
[00:45:26] Speaker A: Cause me to lose my focus. So I do appreciate you giving me some time.
[00:45:29] Speaker B: You said nothing. You were over there just processing.
[00:45:31] Speaker A: I was really trying to hold it together. I don't know. It's. Cuz it was like bass boosted through the speakers and then he started laughing and it just. It just got. It just gave me a giggle. Christian is always good at giving us.
[00:45:41] Speaker B: A good for a chuckle.
[00:45:42] Speaker A: Funny guy.
[00:45:43] Speaker B: Yes, indeed.
[00:45:43] Speaker A: Funny how? He amuses me? Yeah, he makes me laugh. I'm just kidding, Christian.
[00:45:48] Speaker B: I hate this. You can't really finish that whole monologue, you know, because it's.
[00:45:52] Speaker A: He's a big boy. What did he say? Yeah, I can't.
[00:45:54] Speaker B: He knows what he said.
[00:45:55] Speaker A: Can't. Really. Can't really.
[00:45:57] Speaker B: Yeah.
[00:45:57] Speaker A: Again, we'll do Technado After Dark and it'll be uncensored. Yeah, I'm just kidding. We're not gonna do that. Yeah, we're gonna start getting.
[00:46:02] Speaker B: What are you doing after dark? Give me the... Give me the After Dark Technado.
[00:46:06] Speaker A: Except my grandma, she'll be like, no thank you.
[00:46:08] Speaker B: Yeah, that is not. Lady, lady.
[00:46:09] Speaker A: That is not a good idea. Grandma was like, grandma's in the comments like almost every week. She's a supporter. Appreciate that.
[00:46:14] Speaker B: She's a ride or die.
[00:46:15] Speaker A: She's a ride or die. She is. Love you, grandma. Thanks for that. I'm not gonna give away her name, though. That secret identity. We'll move on. Just wanted to give that update on Kaspersky, so thank you for that, Daniel. Video game voice actors are officially on strike over AI. "We refuse this paradigm," which is a little dramatic, but I'm.
[00:46:32] Speaker B: What are we in translate?
[00:46:33] Speaker A: We refuse this paradigm tragedy today.
[00:46:37] Speaker B: Jared. Boy, they eaten my wolves. And you don't want to know how that story went because you see, when we had a conversation in between segments, it went down that road and it got.
[00:46:50] Speaker A: It did. It got dark, got very scientific, but funny. I was a little scared.
[00:46:53] Speaker B: See, I said, but funny.
[00:46:57] Speaker A: We're gonna get flagged.
[00:46:58] Speaker B: I'm having fun now.
[00:47:00] Speaker A: He's gonna spend the rest of the episode trying to give me a break again. Yeah, I'm not gonna do it. So video game voice actors are on strike. SAG-AFTRA actors will refuse to work for Activision, EA, Take-Two, and a bunch of other major developers, because they want to make sure that they're not going to be taken advantage of in the way of having their voice or their likeness replicated by AI. Interesting little side part to this is that this doesn't apply to games that are already in development. So a lot of the voice actors working on GTA 6, which we've been waiting for for 17 years, are a.
[00:47:32] Speaker B: Little while longer, right? Yeah, we waited this long.
[00:47:35] Speaker A: It's. There's. There's like. It's like a whole joke now. Like, we got blank before GTA six. Like, it's just people waiting for GTA.
[00:47:41] Speaker B: Three and be happy with your life. Right.
[00:47:43] Speaker A: GTA five, if you. If you prefer. But they're upset because I guess they.
[00:47:47] Speaker B: What's your favorite GTA?
[00:47:49] Speaker A: I don't play it. I watch it. I watch people play it. Yeah.
[00:47:52] Speaker B: Interesting.
[00:47:52] Speaker A: Specifically, like, I've. I've watched my brother play it at some point.
[00:47:55] Speaker B: Gotcha.
[00:47:55] Speaker A: Because it's. It's a little.
It's a little violent.
[00:47:59] Speaker B: It is a little violent. But for me, I love the game, the open world of it.
[00:48:05] Speaker A: Oh, sure.
[00:48:05] Speaker B: Being able to just absolutely drive with reckless abandon.
[00:48:09] Speaker A: Yeah.
[00:48:10] Speaker B: As fast as you can. Just do stupid stuff with your car. And then, of course, you get all the stars, and then here comes the cops and everything, and they send the National Guard after you after a while, like, if you get five stars, probably there was a cheat code, and I want to say it was GTA three, where you could get the tank and drive around in the tank.
[00:48:29] Speaker A: That sounds right. That sounds like a GTA three.
[00:48:31] Speaker B: That sounds super fun. It was stupid fun.
[00:48:34] Speaker A: The same developers, Rockstar, made another game called L.A. Noire, which is one of my favorites. And it's similar mechanics in that it's pretty open world. But you're, like, a detective in the 1940s, and so you're doing good. He does.
[00:48:44] Speaker B: That's cool.
[00:48:45] Speaker A: Spoiler. He cheats on his wife. So that is not good.
[00:48:48] Speaker B: But let's not do that.
[00:48:49] Speaker A: He does a lot of good throughout the... So you're playing as him, having to solve these mysteries and, like, interview people. It uses the motion capture. You have to, like, read their facial expressions.
[00:48:56] Speaker B: He's a flawed character.
[00:48:57] Speaker A: Obviously he is. It makes you really like him, and then it's like, oh, I don't like you anymore, 'cause you cheated on your wife. And then, spoiler, he dies at the end because he, like, sacrifices himself. So, anyway, I like that game because it's similar mechanics, I think.
[00:49:09] Speaker B: Did Max Payne... was that from Rockstar?
[00:49:11] Speaker A: What was it called?
[00:49:12] Speaker B: Max Payne.
[00:49:12] Speaker A: Max Payne. Oh, I don't know.
[00:49:13] Speaker B: You don't know Max Payne?
[00:49:15] Speaker A: I don't think so.
[00:49:16] Speaker B: Oh, my goodness.
[00:49:17] Speaker A: Is he related to Major Payne? It is Rockstar.
[00:49:19] Speaker B: Yeah. That is one of the best games ever created. Oh, it's not the second one. One second.
[00:49:26] Speaker A: The first one was neo-noir. That looks like something I would enjoy.
[00:49:30] Speaker B: Whoa.
Max Payne is literally easily top five games of all time.
[00:49:38] Speaker A: I didn't expect you to be so passionate about this.
[00:49:39] Speaker B: Yeah, come on, man.
[00:49:40] Speaker A: This looks like something I wouldn't do.
[00:49:42] Speaker B: It was such a game changer to turn a phrase, right? Like, yeah, with the bullet time physics. It's on iOS, and, bro, you. You're gonna lose your mind. When I got that game, I did not emerge from my cocoon until it was beaten.
[00:49:59] Speaker A: Huh.
[00:49:59] Speaker B: I spent so much time playing Max Payne.
[00:50:02] Speaker A: Interesting. I mean, it came out 23 years ago, so I wonder if I could still play it.
[00:50:07] Speaker B: Like, it had the whole noir thing going on. It was very gritty.
Like, you'll. You'll love it. You will absolutely love that game.
[00:50:17] Speaker A: Interesting. Maybe it'll be a good alternative for me to. A good alternative to GTA.
[00:50:21] Speaker B: Yeah.
[00:50:21] Speaker A: Lego city undercover is also good for that because it's open world, but you're like, it's Lego, so it's not violent. Anyway, I got way off topic, but I appreciate the recommendation. I'll have to look into that.
[00:50:31] Speaker B: Yeah. Yeah.
[00:50:32] Speaker A: So it applies to SAG-AFTRA actors because obviously they are part of that union. And the thing that's interesting to me about this is that obviously it's the union that's going on strike. But there are voice actors that do work for companies like this that are not part of any kind of union. Florida is a right-to-work state. I'm a voice actor. I can do voice jobs and not be part of the union. Now, for acting jobs, there's some kind of rule about SAG-AFTRA where you can do one as a non-union worker, and then you become SAG eligible, and then after your second job, if that's a union job, you have to, like, join the union or else they won't let you do any more union work. You can still do non-union stuff. But I just wonder if there are enough folks that are non-union that will continue to do the work because they're not really part of this, or if there's enough people that agree with this overall goal of, like, hey, they say AI stuff's getting out of control, we need some protections in place to protect our likeness and our voice, that even folks that aren't really part of the strike are willing to, like, participate and boycott almost to a degree. But I'm curious to know what y'all think about this, because I think that when people talk about, like, name, image, and likeness, sometimes people forget that a voice is a part of that. And Daniel brought up a good point when we were talking about this earlier, about, like, what does that mean for impersonators? If, like, Nancy Cartwright, the voice of Bart Simpson. Right, right. If she has, like, protections in place that say you can't use AI to emulate my voice for Simpsons episodes, what if I do a really good Bart Simpson impression?
[00:51:53] Speaker B: So what, that's an interesting, like, take on it, though, because the Bart Simpson voice is not her voice. That is a voice she puts on.
[00:52:04] Speaker A: Right.
[00:52:05] Speaker B: As a voice she created for the Simpsons.
[00:52:08] Speaker A: That's true. Yeah.
[00:52:09] Speaker B: Right. So I could see a judge and a jury or whoever going, hey, that's really a character voice, and the character is owned by Fox.
[00:52:18] Speaker A: True. So they can own by whatever.
[00:52:20] Speaker B: Whoever. Right. Yeah. Whereas if it's just Nancy's voice, that's a different story.
[00:52:26] Speaker A: Yeah.
[00:52:27] Speaker B: So Nancy owns Nancy's voice. I believe there are actual, like, likeness clauses and stuff in a lot of contracts saying, oh, we don't own you, we don't own blah, blah. But if you create a character for us, then that character is owned. Now, where that gets weird is, you're obviously playing a character if you're in a movie or whatever.
[00:52:47] Speaker A: Right.
[00:52:47] Speaker B: And that character is owned by whatever movie house or television show or whatever.
[00:52:53] Speaker A: The case studios or whatever. Yeah.
[00:52:55] Speaker B: But since you're using your own voice, how much of that likeness do they own?
[00:52:59] Speaker A: Yeah. And can they use it?
[00:53:01] Speaker B: That's where it gets weird. Right.
[00:53:03] Speaker A: Because, like, working here, I know there's a certain, like, if I'm in videos on ACI learning.com and it's in our library, like, I can't say, well, that's me. You need to take it down because I don't work there anymore. Like, they own that. That's their content that they own. Does that also apply to my voice? Can they take, like, clips of me talking?
[00:53:17] Speaker B: What if they, let's say they take the character. Right. Because obviously, if you played a character, sure. That people know it's you, but you kind of part ways with them, and they started creating a character based off of that character that you did that looks a lot like you, sounds a lot like you, maybe is not quite you, and they're doing things that you don't agree with.
[00:53:36] Speaker A: Yeah.
[00:53:37] Speaker B: Right.
[00:53:38] Speaker A: Where's the line?
[00:53:38] Speaker B: Where's the line? Seems like we got a lot of work to do when it comes to: I need to be able to own me, and what my likeness does and does not do.
[00:53:48] Speaker A: Right.
[00:53:49] Speaker B: And does and does not say. So, on one hand, it does seem like I would be on their side on this. I don't like the fact that they got the communist fist as their.
[00:53:59] Speaker A: Yeah.
[00:53:59] Speaker B: Logo.
[00:54:00] Speaker A: That is interesting.
[00:54:00] Speaker B: The red salute is not my... I'm not a communist.
But I do believe that... I get what they're saying, and probably to a great extent, because, like you said, we work in this space where our likenesses are used.
We are bound in contract to do X, Y, and Z. Like, if I make a course for ACI Learning, they own that course; that is theirs. But if I part ways with them, I should be released from my contract, and I can now start creating that same, similar content.
[00:54:36] Speaker A: Like, you couldn't take stuff that was already in the library.
[00:54:38] Speaker B: Right, right.
[00:54:38] Speaker A: And luckily, I think we're in a situation where, with the content that we create, I couldn't see them really having a... there would be no benefit to them being like, well, we're gonna take your voice. For what? To what end? Because you would still need my appearance, right, to create these videos where we're teaching something. Unless they were hoping to use it in, like, a voiceover sense. In that case, I might have something to worry about.
[00:54:57] Speaker B: Impersonations. If I can do a good impersonation. Is that infringing?
[00:55:02] Speaker A: Yeah.
[00:55:02] Speaker B: Right. Are all the impersonators out there out of business now because they're stealing someone's likeness?
[00:55:08] Speaker A: In a way, Dana Carvey's in trouble.
[00:55:11] Speaker B: Seems like there's a lot of work that needs to be done, or if it is done, you know, made more aware to the people so that they know if you get into this business, this. This is the reality of it.
[00:55:21] Speaker A: Read your contracts. Yeah. When you sign on to do something.
[00:55:23] Speaker B: They're called lawyers, ladies and gentlemen. Get one.
[00:55:26] Speaker A: Because there's voice-over jobs that maybe you're not as worried about. If it's, like, a mom and pop shop that needs, like, a voicemail. You know, we need you to say, like, oh, press one to hear our menu, press two to da da da da. I'm not gonna be as worried about, like, granny down the street.
[00:55:39] Speaker B: But what if that becomes a meme, right? What if, for whatever reason, like, I bet. I bet half of our viewers could sing the hold music.
[00:55:48] Speaker A: Oh, yeah. Yeah.
[00:55:50] Speaker B: You know what? See, I didn't even sing it.
[00:55:53] Speaker A: It's good.
[00:55:53] Speaker B: Everyone knows exactly what I'm talking about.
[00:55:55] Speaker A: Hold music kind of goes hard.
Yeah, it's good.
[00:55:59] Speaker B: I'm like, yep.
[00:56:02] Speaker A: Getting down. Just leave me on hold, son.
[00:56:06] Speaker B: That's the best part.
[00:56:10] Speaker A: At that point, like, I hate when they interrupt to be like, your call is important to us. It's like, yeah, I know.
Let me listen. Don't interrupt me until you're ready for me.
[00:56:17] Speaker B: That's right.
[00:56:17] Speaker A: But you're right. Like, I guess in that case, you can never be too careful. You have to, I guess, pick your battles in that case. But especially with a big company like the woman that did the Siri voice and how she ended up having to fight some battles surrounding that. If you're doing stuff for Disney, like.
[00:56:30] Speaker B: Oh, yeah, House of mouse, they like to own everything. Like, ip is a big deal for a lot of these things.
[00:56:35] Speaker A: Mine now. Yeah, get in trouble for that now.
[00:56:37] Speaker B: You get a cease and desist.
[00:56:40] Speaker A: They're gonna be like, we own. Thank you.
[00:56:42] Speaker B: Whoa, where's my royalty for that? We didn't make any money off of this.
[00:56:47] Speaker A: I think for long-term video game roles, it would be difficult to do it entirely with generative AI, because if you listen to it for long enough, it's like you can kind of tell. But especially for just little quips, like little NPC lines? Super easy to just generate.
[00:57:02] Speaker B: So why should... I mean, I guess the question becomes, why shouldn't the game companies be allowed to use AI-generated stuff?
What's the argument on that?
I guess if you're okay with some of it, let me go down the road: how much, and why? It seems arbitrary that you chose, here is the line and not here. You gotta give some good reasoning, objectively, why you should go this far with AI and no farther, and then employ X, Y, and Z people.
Yeah, that's, that's another conundrum and a.
[00:57:38] Speaker A: Question, I guess. You would almost have to just... it would be in a contract, right? Like, we reserve the right to do X, Y, and Z with your voice. And as an individual, you just have to decide if you're willing.
[00:57:47] Speaker B: Well, is that what their problem is? Is the problem that they're using AI, or is it that they're using AI to mimic the voice actors?
[00:57:55] Speaker A: The actors are striking for fair compensation and the right of informed consent for AI use of their faces, voices, and bodies.
[00:58:03] Speaker B: Okay, so it is all about using AI to recreate their.
[00:58:06] Speaker A: Their name, image, likeness, or whatever.
[00:58:08] Speaker B: Yeah, that's, that's a problem.
[00:58:10] Speaker A: Yeah. So it's just as long as they, I guess, have some kind of a meaningful agreement that everybody understands. And they're compensated for.
[00:58:16] Speaker B: Right, right. Makes sense. Like, come on, that's a simple thing. I'm gonna create a game. We created a character that's based off of your likeness, your voice, your face, your body, so on and so forth. In that contract, it should just say: you get compensated in perpetuity for every time we use this character. Yep, the end. And you get X percentage or X amount, whatever you guys agree on. I mean, it just seems that simple. I mean, maybe it is just an overreaching, profiteering glutton that just wants to not pay.
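For illustration only, the per-use compensation model Daniel sketches ("you get X percentage or X amount" every time the character is used) is simple enough to write down. Every rate and number below is invented, not anything from an actual SAG-AFTRA contract:

```python
# Hypothetical sketch of a per-use royalty: a flat fee per use plus a
# share of the revenue each use generates. All figures are made up.

def royalty_owed(uses, flat_fee_per_use=0.0, revenue_per_use=0.0, pct_of_revenue=0.0):
    """Total owed to the performer across all uses of their likeness."""
    return uses * (flat_fee_per_use + revenue_per_use * pct_of_revenue)

# 1,000 uses at $50 per use, plus 2% of $300 in revenue attributed per use:
print(royalty_owed(1000, flat_fee_per_use=50, revenue_per_use=300, pct_of_revenue=0.02))  # 56000.0
```

The point of writing it out is Daniel's: the mechanics are trivial, so the sticking point in the negotiation is the consent and the rates, not the math.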
[00:58:51] Speaker A: Yeah, I guess if you can get away with using it for free, why would you pay, in their estimation. But they've been in negotiation for over a year on this, and finally they've decided: we're not getting anywhere, we're going on strike. So it'll be interesting to see where this goes. And to those GTA 6 voice actors... they're not part of this. Hope your work goes well. We'll pray for you.
As I mentioned, it would suck to be in that position. I would hate to be in that position. We mentioned earlier that we might be able to potentially convince our director, Christian, to put our boot loops graphic back up on the wall for those of you that missed it last week, because we do have a small update on the CrowdStrike situation. So I don't know if that's something we'll be able to put up. But in the meantime, CrowdStrike is offering a $10 apology gift card to say sorry for the outage. So this applied to, like, their partners and things that were affected by that big outage. And, you know, there it is. In case you didn't hear about it, there's the graphic. You might have heard about a little outage that happened last week concerning CrowdStrike. CrowdStrike offered its partners a $10 Uber Eats gift card as an apology, for a late night snack or coffee.
The issue that a lot of people had was, okay, a $10 Uber Eats gift card? That will barely cover the delivery fees that Uber Eats charges me to order a minimum of $10 of stuff, because you can't order just, like, a $5 snack. It's like a minimum of $9 or something.
[01:00:12] Speaker B: So it's like, yeah, it's a delivery guy with.
[01:00:14] Speaker A: Yeah, you got to tip them a certain amount, or else they're not going to bring you your food, or they'll spit in it or whatever. There are people that, like, film themselves.
[01:00:21] Speaker B: I'm not saying that that doesn't happen. It's just funny. That's where we go in our minds.
[01:00:25] Speaker A: There's, like, videos of people being like, immediate vindication.
Like, I have your food. I saw you didn't tip. Were you gonna tip in cash? And they're like, huh? What? And then it's like, well, even if I was gonna tip in cash, like, dang, you're kind of being a jerk about this. Like, what are you gonna do, withhold my food anyway? So, point being, a lot of people were like, $10 gift card. Thank you for this. What am I gonna do with this? But on top of that, when people went to redeem the code, they were getting an error message that said the gift was canceled by the issuing party and is no longer valid. So as a result of an outage that resulted in a lot of things not being available, they got a gift card as a sorry gift that was also not available that resulted in an error.
So kind of a sucky situation all around.
[01:01:04] Speaker B: CrowdStrike's just, you know, the hits just keep on coming.
[01:01:07] Speaker A: They just can't catch a break. They really can't.
[01:01:09] Speaker B: They really can't. Unfortunately for them. I mean, are they giving out... and maybe you said this and I just missed it. So let's say I'm XYZ Corp, and I had 2,000 devices that were downed because of this whole CrowdStrike debacle. Do I get 2,000 $10 Uber Eats gift cards?
[01:01:32] Speaker A: That is a good question. It says it was sent to their partners and some of the people who tried to redeem it, it didn't work.
But it doesn't say whether it's, like, we're going to send a gift card to each of your employees, or we're sending you a gift card for each device that was affected. It just says, we send our heartfelt thanks and apologies for the inconvenience, gives a reason for the delay and, like, things that were impacted, and then: oh, many of you have been proactive in assisting your customers with recovery, and we want to ensure you have access to the latest information and tools. To express our gratitude, your next cup of coffee is on us. But that's...
[01:02:06] Speaker B: That's weird, though. It's like, yeah, who the hell gets that? Who is it that gets it? Is it people that are proactively being helpful? Like, you, randomly, if they saw you on the interwebs being like, hey, here's how to fix your CrowdStrike issue?
[01:02:20] Speaker A: Like, is it the IT teams at these companies, or is it just, like, the CSO gets one and that's it? Like, right.
[01:02:26] Speaker B: Like, this is weird. This is an odd.
This is an odd thing, huh? With very little detail on how this works.
[01:02:35] Speaker A: Yeah, it just says partners.
[01:02:37] Speaker B: We send this to our teammates and partners who have been helping customers throughout the situation.
[01:02:41] Speaker A: Yeah, not customers, but partners. So.
[01:02:43] Speaker B: And then they totally got hosed because Uber was like, this is fraud.
[01:02:47] Speaker A: Yes, Uber. The reason the codes kept coming up as an error is because Uber flagged them as fraud because of the high usage rates. So it's just... CrowdStrike cannot catch a break. And I don't know that there was any, like, oh, we'll reissue the code. I don't know if they...
[01:03:01] Speaker B: I'm sure there are very few CrowdStrike individual users out there. It's mostly corporate entities that use CrowdStrike. So I would assume they're sending these $10 gift cards to these corporate people.
[01:03:20] Speaker A: Yeah, but that's a good question. Like, if it's. If there was a team of ten in this corporation that was working on this stuff.
[01:03:26] Speaker B: And that's what I love about TechCrunch, the detail and information they give you.
[01:03:32] Speaker A: I'm looking at, like, other sources too, and it's not ambiguous at all; it just says partners. Partners. Partners. Define partners. Right. Yeah, they said: CrowdStrike did not send gift cards to customers or clients. We sent these to our teammates and partners who have been helping customers through this situation. But other than that, no specifics. So if you happen to know, for whatever reason... maybe you work at CrowdStrike.
[01:03:53] Speaker B: Let us know, jump in the comments.
[01:03:55] Speaker A: And if you do, we are so sorry, and we hope.
[01:03:57] Speaker B: Yeah, we know that you're having, like, a real piss poor.
[01:04:00] Speaker A: If you need. If you need anything.
[01:04:02] Speaker B: Yeah.
[01:04:02] Speaker A: You reach out, help is on the way.
[01:04:04] Speaker B: We won't send it.
[01:04:06] Speaker A: Somebody will.
[01:04:07] Speaker B: Sure.
[01:04:07] Speaker A: Somebody put in the comments.
That was pretty much the only update I had on CrowdStrike because, you know, they've been working on resolving the issues and everything. And so that's good. Positive developments. But we do have one more thing that we want to get into that you may have heard about this week. Daniel, I'm glad that you grabbed this one. KnowBe4 hires fake North Korean IT worker, catches new employee planting malware.
And this guy, like, circumvented all of their measures to, like, he was able to outsmart basically all these measures just in the hiring process. Right?
[01:04:37] Speaker B: In the hiring process, yes. And it wasn't like, hi, I'm Bob Smith from Ohio, I would love to work at KnowBe4. If you read the article, this individual was very crafty, used every single trick in the book and maybe even more to create a hireable character for the purposes of delivering malware into the KnowBe4 environment.
So, interesting thing about this. Let's go up a level with what we can learn from this. KnowBe4 admittedly said: they obviously found some weaknesses in our hiring process, and we need to move with the times, because we live in a new world. AI is a thing.
And this individual used AI to create a deepfaked character: voice, face, sock puppet accounts, everything they needed. And they utilized things like VPNs, because this was a North Korean hacker that got a job with KnowBe4. Okay. And they said, hey, send me my laptop for my work purposes. And when they got that laptop, immediately, KnowBe4 said, immediately after them receiving the laptop, we started seeing hits on malware. They started trying to upload malware, access systems. So we saw two things happen with KnowBe4. One: yep, it's time to update the old hiring processes when it comes to the security around that, and verification of, is this person who they say they are. We need to have a little more robustness when it comes to vetting these individuals. Two: their software did a great job, because it was almost immediate that they saw, hey, what's that you got there?
[01:06:34] Speaker A: Well, that's a plus.
[01:06:35] Speaker B: What's that you're doing? Yeah, yeah, this is a plus. So their software did great in verifying and detecting that, hey, you've got, seems like you're trying to upload some malware there and we don't really enjoy that. So you're going to need to stop and we're going to go ahead and stop you, and now we're going to revoke access and everything. You got a perplexed look on your face.
[01:06:53] Speaker A: I'm just wondering: during the hiring process, it said he slipped past the background checks and everything. So I'm following all that. But usually, even if you're remote, isn't there some kind of an interview, as far as, like, audio or video? So how did they not... they did?
[01:07:07] Speaker B: A deepfake, if I'm not mistaken, on voice, and maybe even visual. Like, he created a deepfake with an AI-generated person.
[01:07:18] Speaker A: Dang.
[01:07:18] Speaker B: Yeah, this was no joke. It wasn't like, well, you know, you didn't have like a paper mask on going, you know, some Groucho Marx glasses.
I'm here for the interview, you know.
[01:07:30] Speaker A: Huh.
[01:07:30] Speaker B: That kind of stuff. This was a, this was very sophisticated.
[01:07:34] Speaker A: Okay. That makes more sense than if he was deep faking. I guess I didn't think about his voice. That's kind of scary.
[01:07:39] Speaker B: Yeah. How this works, and this is the scam, is that the fake worker asks to get their workstation sent to an address that is basically an IT mule laptop farm.
Then they VPN in from where they really physically are. In this case, this was either North Korea or right over the border in China, because China and North Korea share a border, and they work the night shift so that they seem to be working in the US daytime. So they had all the t's crossed and all the i's dotted when it came to looking as legitimate as possible. The scam is that they are actually doing the work and getting paid. So it's not that they don't work; they actually do the work that they've been tasked to do, and they get paid, and then they take that money and they give it to their government.
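The scheme Daniel describes suggests a simple detection heuristic. Below is a minimal, hypothetical Python sketch of one check a security team might layer on: flag a remote login when the IP's geolocation lands far from the address the company laptop was shipped to, or when the IP belongs to a known VPN or proxy range. All names, coordinates, and thresholds here are invented for illustration, and this is not a description of KnowBe4's actual tooling:

```python
# Hypothetical check: does a remote worker's login geolocation plausibly
# match the address their company laptop was shipped to? All data made up.
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class LoginEvent:
    user: str
    lat: float      # geolocated from the login IP
    lon: float
    via_vpn: bool   # e.g. the IP falls in a known VPN/proxy range

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))

def suspicious(event, shipped_lat, shipped_lon, max_km=500):
    """Flag logins far from the laptop's shipping address, or via VPN."""
    far_away = distance_km(event.lat, event.lon, shipped_lat, shipped_lon) > max_km
    return far_away or event.via_vpn

# Laptop shipped to a "laptop farm" in the US; login geolocates overseas via VPN.
evt = LoginEvent(user="new_hire", lat=39.03, lon=125.75, via_vpn=True)
print(suspicious(evt, shipped_lat=40.71, shipped_lon=-74.00))  # True
```

In the real incident, it was the endpoint software catching malware activity, not geolocation, that tripped the alarm, so a check like this would only be one signal among several; mule laptop farms exist precisely to make the geolocation side look clean.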
[01:08:24] Speaker A: Yeah.
[01:08:25] Speaker B: Right. And this is to like fund their, their economy basically. Because. Tell me the last thing you bought from North Korea.
[01:08:33] Speaker A: Yeah, that's true.
[01:08:34] Speaker B: Because they don't make anything.
Yeah, they make ICBMs. That's about the only thing they build.
[01:08:41] Speaker A: I think, if I'm not mistaken, I found the picture of, like, the original versus the one that he submitted to HR.
[01:08:47] Speaker B: Oh, yeah.
[01:08:48] Speaker A: Fake, I'm pretty sure. So. Oh, it was a stock photo that he used. So this was the stock photo and then this was the enhanced one that was submitted.
[01:08:56] Speaker B: Excellent.
[01:08:57] Speaker A: So it's pretty, I mean, it's pretty convincing, I guess maybe if you looked real close, but it also could just be like, oh, it's not a very good quality picture, you know? Right.
[01:09:03] Speaker B: I've seen some pretty crap pictures on like, LinkedIn profiles.
[01:09:06] Speaker A: It's not like he's got like an extra finger or something that would give away like weird.
[01:09:09] Speaker B: You leave fingers out of this stupid AI. Right.
[01:09:12] Speaker A: And then even if it was like a 6th finger, be like, hey, I was going that way. You can't.
[01:09:15] Speaker B: Definitely, if you show finger, I will delete your AI.
[01:09:20] Speaker A: Wow.
[01:09:21] Speaker B: I don't want Will Smith eating spaghetti in this.
[01:09:24] Speaker A: Huh?
[01:09:25] Speaker B: Oh, if you haven't seen it, it's like this side by side video of where we were, like, as of last year when it comes to what AI could create, as far as a video goes.
[01:09:34] Speaker A: Yeah.
[01:09:34] Speaker B: And the model they used, or the prompt they used, was Will Smith eating spaghetti. And then now, next to it, they have, like, where we're at now. And it's completely... it's almost disturbing to see where it was before.
If this is the right, yeah, you can show that.
[01:09:54] Speaker A: I think if it's the right thing. Will Smith eating spaghetti. AI. Okay, I think this is it. This was last year.
[01:09:59] Speaker B: Oh, yeah. Show that, Christian.
Yeah, it's really weird and kind of disturbing.
[01:10:07] Speaker A: They, uh. Yeah, well, that's, I mean, I had.
[01:10:12] Speaker B: The vision if this fooled. No, before then, their hiring press is like, I met this gentleman.
[01:10:19] Speaker A: It just keeps going online. Okay, so then this is 2024, because it just keeps going for like a solid minute.
[01:10:24] Speaker B: Wow.
[01:10:25] Speaker A: It's a lot better. I mean, it's still a little scary.
[01:10:27] Speaker B: So this is where we are as far as AI generation goes now. This has aged. Oh, right. Still weird.
[01:10:34] Speaker A: Yeah, that's unnerving.
[01:10:34] Speaker B: Can still get weird.
[01:10:36] Speaker A: Great. Now I'm gonna get all these weird AI-generated videos in my suggestions.
[01:10:43] Speaker B: But, yeah.
[01:10:46] Speaker A: I'm glad that no, before caught this. And it's great that their software is good enough that, like, for them that, that they were able to catch it early. But good thing to be aware of.
[01:10:54] Speaker B: Especially if you're hiring. The KnowBe4 CEO warned that the unidentified North Korean operatives showed a, quote, high level of sophistication in creating a believable cover identity, exploiting weaknesses in the hiring and background check process, and attempting to establish a foothold within the company. They just weren't good enough. KnowBe4 smashed you. Good job.
[01:11:12] Speaker A: Hopefully they've learned, right? This is how you learn. They'll do some, like, security, some mandated security awareness training now that they'll send out to their employees. I know we got a, like a message company wide that was like, hey, did you see this thing that happened? It was like a teams message. Somebody sent out. Like, look at this fun little news story. Maybe you should be aware if you're hiring. Let's be extra careful.
[01:11:32] Speaker B: So really, really verify. And a lot of that probably comes from the fact that we use third parties to do verification. Right?
[01:11:37] Speaker A: Like contractors and stuff.
[01:11:38] Speaker B: And then those third parties, because they're trying to do it in bulk, they come up with a process.
And that process is not thinking that, oh, this could be an AI generated person.
[01:11:48] Speaker A: Right. It's just doing the best it can.
[01:11:50] Speaker B: And now they're trying to play catch up on what do we do? Now that there are AI generated deepfakes that are out there.
[01:11:56] Speaker A: Yeah, it's a good point.
[01:11:58] Speaker B: Right?
[01:11:58] Speaker A: It's kind of scary getting ahead of the game. Well, that's an interesting one. Probably one that you saw this week. I think that was a pretty, pretty big one. I did see that on a couple different sites. It didn't slip by many people's radars. I think that's pretty much all we've got for news this week.
We are recording a little bit of a different schedule this week because these few weeks have been pretty busy. And next week's episode is probably going to be a little different, too, because I'm about to become an ant. So I'm going to be gone for a little bit next week.
[01:12:22] Speaker B: Small, little bug with antenna. Yeah.
[01:12:24] Speaker A: So I can attend the school for ants from Zoolander. Yeah, that's what I'm doing. I'm traveling.
I'm getting a couple new family members next week, and so I'm very excited. Not like they're. I'm getting a couple. We're generating them artificially.
No, it's... I'm very excited. So I will be gone a couple days next week. So Technado might look a little bit different, but there will still be a Technado.
[01:12:45] Speaker B: Dang, it will be a technato.
[01:12:46] Speaker A: And before we even get there, we actually have a webinar this week. All Things Cybersecurity is back. The day this episode is released, it'll be at 2:00 p.m. Eastern time. It's going to be here on the YouTube channel as well as, I think, on LinkedIn and potentially even on Zoom. So who's our guest again?
[01:13:00] Speaker B: Mishaal Khan.
[01:13:01] Speaker A: Mishaal Khan. I'm excited.
[01:13:02] Speaker B: We haven't had him on. Social engineering, OSINT, cyber master.
[01:13:06] Speaker A: I do enjoy a bit of osint. That is always interesting. And we haven't had him on before.
[01:13:09] Speaker B: So it'll be a first time guest. So I'm really excited to have.
[01:13:13] Speaker A: It's going to be a great time. So make sure you join us for that. It's going to be again later today as of the day this episode is released. Other than that, though, thank you, Daniel, for all of your fun, little fun facts that you threw into this episode. And during our break.
[01:13:25] Speaker B: I don't know how many of them.
[01:13:26] Speaker A: Are facts, but yeah, disturbing statement. We'll go that way.
[01:13:30] Speaker B: That's as best as it gets from Daniel.
[01:13:32] Speaker A: And of course, thanks to Christian for supplying our lovely graphics and his beautiful voice, musical stylings over the speakers and of course, thank you for joining us. Leave a like if you enjoyed this episode, subscribe so you never miss an episode in the future and we'll see you next time.
Thanks for watching. If you enjoyed today's show, consider subscribing so you'll never miss a new episode.