Episode Transcript
[00:00:04] Speaker A: You're listening to Technado.
Welcome to another episode of Technado. Just a quick reminder before we jump in that Technado is sponsored by ACI Learning, the folks behind ITPro. And you can use that discount code, Technado30, for a discount on your ITPro membership. Almost lost it there. I'm Sophie Goin, and you might have noticed we are in a different studio than we usually are. We are actually gonna be making a transition here in the next couple of weeks. But for now, we'll just give you a little sneak peek. You're not gonna see any more than what's in these four walls here. And Daniel is joining us remotely again this week, but we're hoping to have him back in the studio next week. For now, though, Daniel, we're just happy to have you here. How you feeling? How's the arm doing?
[00:00:40] Speaker B: Yeah, it's doing really well. Thanks for asking. Uh, healing up quite nicely. Looking forward to the doctor saying, you can take that sling off, because this thing, it's like my own personal iron maiden.
[00:00:53] Speaker A: So you do you got a. You can't really take the sling off or anything. Like, I know you're not supposed to drive and stuff, but, like, it just has to stay on at all times.
[00:01:00] Speaker B: I can take it off. They. They want me to take it off, like, a few times a day to kind of bend at the elbow so your elbow doesn't get frozen up. For anybody that's looking at having any kind of shoulder surgery, here's what you can expect. Uh, your elbow will hate you because it stays in that crooked position all day long. And if you're not bending that, the fluids and everything in your arm will kind of build up. So you got to make sure to try to use your hand a lot and do things to get that elbow moving. Take it out of the sling a couple times a day and just bend at the elbow a little bit and work that out, because, man, that joker's been murder.
[00:01:36] Speaker A: All right, well, it's good to know I'm not planning on having shoulder surgery, but it's not the kind of thing you usually plan on, so maybe it's.
[00:01:43] Speaker B: Yeah, I didn't. I didn't wake up one morning, go, you know what? I want shoulder surgery.
[00:01:47] Speaker A: I got time. It's a holiday weekend. Let's do it.
[00:01:49] Speaker B: Yeah. You know, I treat myself.
[00:01:51] Speaker A: Yeah, why not? Well, I hope you did enjoy your Memorial Day weekend. Still hanging out at home with the family, and hopefully all of you out there enjoyed as well, if you had that three-day weekend. We've got quite a bit of stuff to go into, though, because over that three-day weekend we had some things happening. So we're going to start with something that. This is not the first time we've talked about this, even this month on Technado. So it probably could be Deja News, but we're not going to do that. Chrome browser has yet another zero day. This is the fourth one discovered just in this month, just in May 2024, and I believe it's the 8th so far since January. So they're not having a great time. There is a way to mitigate this if you update to the latest version, so get that out of the way. They do have a fix for this, but it is a high severity security flaw. So, Daniel, maybe you had some luck with this. I couldn't find a CVSS score for this yet. I don't think they have one, like, on the NVD.
[00:02:44] Speaker B: Looked for one. I would assume that since this is patched now, it's going to be one of those criticals.
[00:02:50] Speaker A: Yeah.
[00:02:50] Speaker B: So, yeah, I mean, hey, let's start off with the fact that, what are we in, May? Can May calm the hell down already with the vulnerabilities for Chrome? I mean, it's obviously one of the, if not the, most used web browsers on the market. You know, Chrome and Chromium browsers. So all these things are going to affect them. It's just, man, one thing after another. I don't know what was going on over at Google with the Chrome code, but man, they were having a bad day. Maybe someone was going through a breakup or something. Who knows? Their coding was not up to the Google par as it normally would be. Because every other day we're getting all these wonderful, horrible exploits that keep coming out. So, no, I haven't seen what the CVSS scores are. This is just one of those things where it's like, wow, again. Here we go again. I mean, tune in next week to see what the next Chrome vulnerability will be.
I'm wondering if they're just walking through the code or just areas and aspects of the Chrome browser, these security researchers that are finding these flaws, and they've hit a honey hole of bad coding and bad practice that is revealing all these. Because the good news is they're finding them, they're getting patched. And that's, that's what matters.
[00:04:14] Speaker A: That's a good point, I guess. If anybody is going to discover a vulnerability like this, we want it to be a security researcher, right? If, you know, we want this stuff to come out so that we can fix it before somebody that maybe doesn't have so good of intentions takes advantage of it. Now, this one in particular, like I said, it's the fourth one just this month, just in May. It says it's a type confusion bug. And I was not familiar with this type of bug, but according to The Hacker News, type confusion vulnerabilities occur when a program attempts to access a resource with an incompatible type. It can allow threat actors to perform things like out-of-bounds memory access. It can cause a crash. So a lot of things that can happen as a result of this. Uh, Daniel, this is probably a stupid question. Were you familiar with this type of vulnerability? I had just never heard this term used before.
[00:04:56] Speaker B: Yeah. So ultimately, at the end of the day, when a security researcher is looking for ways to exploit software, a lot of times what they try to find is a way to crash the program. Can I give it something that it didn't expect? And usually it comes in the form of user input. So if I have an area where the user can input or affect input into the system, because the system needs to know things, it doesn't know that offhand, so it has to ask for it or give the opportunity for the end user to give it that information. And then it incorporates that into the backend code that runs a query or does whatever it's supposed to do. With type confusion, that's where I go, okay, so you have different data types. If I'm remembering this correctly, this is kind of how it works. You have different data types. Let's say it's looking for a type integer, and I give it a string. It doesn't know what to do with the string, and if it's coded incorrectly, it'll throw an error.
Everything should be thought about and meticulously expected and prepared for. So the developer that wrote this code should have thought about, well, what if someone, for whatever reason, stupid or otherwise, gave me the wrong data type? Well, instead of erroring and crashing the system, possibly, it should just go, hey, you gave me the wrong data type. That's invalid. It should be validating that input before it hits the query. So that wasn't happening, apparently. And if I give it that, it goes, oh, that's the wrong data type, and you get an error. Anytime you've done a compilation, you've seen a compilation error that says, hey, you've got mismatched data types, or expecting integer but got string. You mess around with this stuff a little bit and you'll see that. But if I can cause a crash, well hey, out of the gate I've got a security issue because now I've got a denial of service. So at the very least I can have that.
Sometimes it will allow you, it'll start to do things like spilling into memory in different weird areas or give you access to that memory. And if I have access to that memory, well then maybe I can supply it with code for said memory. And that's where you start to get like remote code execution.
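A minimal sketch of the validate-before-you-use idea described above, with a made-up function name; this is not Chrome's code, and Chrome's actual bug is in C++, where the consequence of the confusion is memory corruption rather than a clean exception:

```python
# Hypothetical example: validate caller-supplied input up front so a wrong type
# becomes a clean rejection instead of a crash (or worse) further down the stack.
def set_mine_count(value):
    if not isinstance(value, int) or value < 0:
        raise ValueError(f"expected a non-negative integer, got {value!r}")
    return value

print(set_mine_count(10))        # accepted

try:
    set_mine_count("ten")        # wrong type is caught here, not somewhere downstream
except ValueError as err:
    print("rejected:", err)
```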
[00:07:04] Speaker A: Ooh, okay, that's always a scary term. They did give us a lovely little list here of all of the zero days that we've seen so far in 2024 for Google, for Google Chrome. So this being the most recent one, and these are the other seven, I guess. So if you missed these, here's a list for you and some links to check those out. But as I said before, there is a fix for this if you upgrade to the latest version of Chrome for Windows and macOS as well as Linux. And they also recommend, if you're using some kind of a Chromium-based browser like Edge. I don't know why you'd use it by choice, but I'm not here to judge you. It's also advised to apply those fixes when they become available. So hopefully this is not going to be anything that affects too many people. They did say there is an exploit for this that exists in the wild, but they didn't give any additional technical details about the flaw. So maybe we'll see more come out about this in the future. Hopefully we don't see any more zero days this month at least.
[00:07:58] Speaker B: I guess that would be great.
[00:07:59] Speaker A: It's what, June? I guess this weekend. So we've gotten there. We're towards the end of the month. No more zero days this month. We'll start to.
[00:08:06] Speaker B: We're on the way to the halfway point of the year now.
[00:08:09] Speaker A: Yeah, we'll start the tally for June. Zero days for June. We'll see if we can, if they can beat their record, but we'll jump in.
[00:08:15] Speaker B: We've had zero days since seeing zero days.
[00:08:18] Speaker A: Yeah, that could be, it could probably be a segment if we wanted to make it one. We could have a little animation. We'll jump into this next article here. Initially upon reading this, I was misled, I'll admit: hackers phish finance orgs using trojanized Minesweeper clone. I thought there was going to be some kind of malware involved. I don't believe there is. What they're doing is they're using the Minesweeper game to kind of disguise it. But the program they're using is a legitimate program. It's something called SuperOps RMM, and it is legitimate remote management software. So this is a real piece of software that you can use for normal everyday things that's just being misappropriated, misused by bad people, by bad guys. But what's interesting, I think, is that the code, like the Minesweeper code, is great for disguising whatever it is that they're trying to get in there. So they include some attack details and stuff. I don't know, I just thought this was kind of neat. I mean, if it wasn't being used for bad stuff, it would be neat in theory.
It's unfortunate.
[00:09:15] Speaker B: Are you identifying with your attackers now? Have you been stockholmed by the.
[00:09:18] Speaker A: I've been going to therapy.
[00:09:19] Speaker B: The ransomware groups, they're not bad people. Right.
[00:09:25] Speaker A: I'm sympathizing. I'm pulling out my Dr. Melfi vocabulary here.
[00:09:30] Speaker B: Gotcha.
[00:09:31] Speaker A: But is this something that, I mean, I've, I've not been in the game as long as some of our viewers have and as long as you have. This to me, is kind of a novel idea, but I'm sure this is not something like this has probably been done before. Hey, download. You know, because all the code is legitimate. There's nothing that's like, oh, well, this is malware. Oh, this is, it's ransomware. Whatever. It's just being used for bad stuff. It's sad, you know?
[00:09:55] Speaker B: Yeah. And it's super smart to kind of bolt things together. And this is really old school, actually, as far as the mentality goes. A lot of the best hacker tools of yore started off as system administration tools, right? So if it gives you access or allows you to modify things and do the things you want to do, well, that's just as useful to an attacker as it is an administrator, right? It's all about how it's being used at that point. So smart on them to use something that's not going to get flagged as malware, if they can just bolt these things together, obfuscate the fact that one thing is a part of the other by hiding that code inside of other code, packaging it in there. Actually, Sophia, it's funny you don't remember this, because you probably didn't realize what was going on. But when we shot CEH, I think, I did this very thing, where I used a Minesweeper game and I packed in a Metasploit Meterpreter, the code for a Meterpreter session, into that game. So when you installed said game, we gained a Meterpreter shell using Metasploit.
[00:11:06] Speaker A: I do.
[00:11:07] Speaker B: Vaguely, yeah. Yeah. So this is not a new thing. It's just a useful thing. And anybody can go out to GitHub and find some open source game or other tool or whatever the case is. It doesn't have to be a game. It could be just anything. And then go, oh, okay, this is written in Python, or this is written in Go, or this is written in Rust. And if I'm handy with those languages, then, because it's open source, I can just add my malicious code to the game, right? And then compile it and make it available on a website or wherever, you know, send it to people: hey, there's a cool game, check it out. Right? That's the old idea of a Trojan horse, that there is something legitimate that has something malicious baked inside of it, right? So this is not that novel as far as the idea goes. It's just that the tools they're using to make it happen are different.
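To make the Trojan-horse idea concrete, here is a harmless toy sketch: the function names are invented, and the "hidden" part only prints a line, but it shows where extra behavior hides inside something the user thinks is just a game.

```python
# Toy illustration of the Trojan-horse pattern: the user runs what looks like a game,
# and code they never asked for runs alongside it. Nothing malicious happens here --
# the "extra" function just prints a message to mark where a real loader would sit.
def play_minesweeper():
    print("Launching Minesweeper clone...")

def hidden_extra():
    print("(code the user did not ask for runs here, before the game even starts)")

if __name__ == "__main__":
    hidden_extra()        # runs quietly first
    play_minesweeper()    # the legitimate-looking part the user expected
```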
[00:12:02] Speaker A: And in this case, it's not even like, here, download this game, here's a link. What's being sent is an email that looks like it's coming from a medical center. And it says, oh, here's a web archive of your medical documents, a personal web archive. So there's a Dropbox link that's provided, and if you download that file, that's when code from a Python clone of a Minesweeper game gets downloaded, and then the malicious code gets downloaded along with that. So you don't even realize you're downloading the game. I guess it's just kind of, it says it's like a cover so that the machine doesn't know.
[00:12:33] Speaker B: So I think it's a clone of the Microsoft Minesweeper game, a clone written in Python. Right. So this is how it kind of works. The phishing link is for medical documents. So, okay, that looks legit. I'm going to click and download these medical documents.
Under the hood, it installs a clone of the Microsoft Minesweeper game, which also has the RMM.
I think it's a loader for the RMM as well. And then now we have C2. So it's a busy little chain of events that's going on there, but an interesting one.
I'm guessing you won't see it. I'd love to see that happen in real time, and I'd love to have a honeypot that I could install this on, or just a VM, and get my hands on this specific thing and watch it install and see how that looks, because that should be pretty interesting.
I mean, is there an installer? Do you see anything? I'm always interested in seeing.
Were there any telltale signs that something malicious may have happened? But a lot of people just don't pay attention, even if there are, and so it gets glossed over. Yeah, it's always just interesting to me.
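On the telltale-signs question, one simple thing a defender could do is look for a remote-management agent running that nobody installed on purpose. A minimal sketch follows; the process-name substrings are assumptions for illustration, not taken from the report's indicator list.

```python
# Sketch of a telltale-sign check: flag running processes whose names look like a
# remote-management agent. The substrings below are illustrative guesses only.
import psutil

SUSPECT_SUBSTRINGS = ("superops", "rmm")   # assumed names; swap in real IoCs

for proc in psutil.process_iter(["pid", "name"]):
    name = (proc.info["name"] or "").lower()
    if any(s in name for s in SUSPECT_SUBSTRINGS):
        print(f"possible RMM agent: pid={proc.info['pid']} name={proc.info['name']}")
```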
[00:13:48] Speaker A: I did try to go look at the report that came with this, and when I clicked on the link to take me to the page, it was all in a different language, Ukrainian.
[00:13:56] Speaker B: You don't. You don't read Cyrillic.
[00:13:59] Speaker A: I tried to translate it to English, and it says here that it's translated to English. And as you can see, it's very clearly not. So I don't know what's going on with my browser, but I got to go in and figure that out. So I did try to go in and look, because I said there was some indicators of compromise listed here. I'd have to run it through Google Translate, I guess. So if you want to take a look at that and try to parse that, feel free. We're going to include links to all these stories in the description. So if you're watching on YouTube, you'll be able to look at the description and click on those if you want to read more about these. But for now, we'll move on to our next one. This is a favorite segment of mine. Who got pwned? Looks like you're about to get pwned.
[00:14:32] Speaker B: Fatality.
[00:14:33] Speaker A: Yeah, you're about to get pwned. As it turns out, Bayer and twelve other major drug companies were caught up in the, I believe that's Cencora, data loss. I've only ever seen it written. I've never heard it said, so I'm guessing there. So there were more than a dozen big pharmaceutical suppliers that are starting to notify people: hey, by the way, your data was stolen. Now, I personally think that these names look a little bit fake. They're real companies, I just think they look kind of silly. And I know this is no laughing matter, but it was this big firm that partnered with a bunch of big pharma dealers, including GlaxoSmithKline, Novartis, Genentech, Bayer, Regeneron, and Bristol Myers Squibb. Those are either Star Wars characters or names of Fantastic Mr. Fox characters by Roald Dahl. So, you know, just a little humor there. I just think they're funny names. But obviously, this is no laughing matter. Personal information affected: first and last name, address, date of birth. The big thing here, I think, is things like medications, prescriptions, health diagnoses, personal medical information. That is kind of the scary part. So maybe I'm wrong. Maybe I'm off on that. If my first and last name is exposed, my address, that's stuff you could probably find if you googled it hard enough. Right. But things like my personal health diagnosis, my prescriptions, I don't want that stuff out there. It's my personal information. So I don't know, a little scary, scary stuff. What do you think, Daniel? What's your take on this?
[00:15:56] Speaker B: So, yeah, the big danger here is, when it comes to personal health information, or PHI, it's a real problem because it gets used a lot for scamming people. Right. Identity theft and that kind of stuff. So, hey, I'm with your health group, or contacting you in whatever form, to talk to you about a legitimate ailment that you have had or medication that you're on, and then they can use that. Oh, we're going to send you a link. We're going to have you install this app, or we're going to blah, blah, blah. Then you're down the road, and now they're pilfering your pockets and stealing your money. So a lot of that gets done this way. So that's unfortunate. Another unfortunate thing is to think of.
We see that this was a $250 billion firm.
That's a lot of money. So they're no small potatoes. The names of these corporations that are a part of this are huge. They are massive, massive corporations.
If they can't do security right with their basically limitless resources, who can?
Uh, it's depressing, almost, to see. Like, either A, they're not trying to do enough about it because they don't care. That's quite possible. Like, that could be the case. We just do enough, check the box, and we'll call it security day, and we'll deal with it if it ever comes to a breach. Well, now they're dealing with it because there is a breach. Or they're trying really hard, and it's still ineffective. And I just get really interested in thinking, like, I see a lot of people post things on LinkedIn about the 14-year-old hackers out there, that they do not gain access into your system in the same way your pen testers and red teamers do.
So I understand there is some overlap, but should we be modeling our red teams and our pen tests more after our attackers, or what can we do to, like, make that a little more, uh, you know, have a little more parity with that so that we can be better prepared for these things if that is the case. If the case is, is that we're just not being effective now, again, going back to, it could be the case that they are just not doing enough because they prefer the money in their pocket over spending it on security stuff. And that is one of the old security tropes out there, is that. Yeah, well, security gets just enough until a breach happens, and then they turn the money faucet on and go, hey, we can't have that happen again. And I'm sure there are many people in that organization that work for their security teams going, we've been trying to tell you that, and it takes this for you to listen.
So maybe it's a little bit of both. Maybe there's some other options as well. But it just. Just always strikes me to see these huge corporations such as this, and then they get breached, and it's like, wow, how the heck did that happen? Well, I know how it happened, but how is it that we. How can we fix that problem? It's a really interesting problem. It's a really difficult problem, and we've been working on it for a long time, and we don't seem much closer to a real solution. We've. We've come up with some good ones, but they haven't been as effective as we had hoped. So, hopefully, this is going to be the impetus to something more useful and more secure.
[00:19:24] Speaker A: Sounds to me like the solution to that problem is hire some more 14 year old hackers on your pen testing or your red team, because clearly they have some, you know, they've got some insights or some ways that they're doing things that we're not covering, but they don't have.
[00:19:35] Speaker B: They don't have, you know, eight years of experience as a pen tester, so they can't get the job.
[00:19:40] Speaker A: If you're old enough to work at Publix, you're old enough to work as a pen tester. All right. We don't have. Our standards in Florida here are low. All right. You can. You can do whatever you want. Plus, hey, that's right.
[00:19:49] Speaker B: They're packing groceries and pen testing.
[00:19:52] Speaker A: They're on the same level. Right. You're part time. They don't have to pay them as much. Right. Because they're only working half the. I think this is a win win.
I'm not an advocate for child labor. I'm just saying, hey.
[00:20:02] Speaker B: Yeah.
[00:20:02] Speaker A: Gonna give them some job experience.
[00:20:04] Speaker B: Sophia's true colors come out.
[00:20:05] Speaker A: Yeah. I have to hedge my kids to work. Somebody's gonna comment and be like, oh, real. I'm gonna get hate for that in some way. But they do shop of kids. Yeah. Right.
I have to put a disclaimer at the bottom, like a little asterisk. I do not support or condone child labor. Of course they do. They put a little asterisk here in their. In their release about this information. No evidence that any of this information has been or will be publicly disclosed or that it's been misused. But this is. It was a data loss situation. So they're noticing that. That all this data has been lost as a result of that earlier breach. And this is the kind of thing that I feel like this is always what is said initially. They just gotta. They gotta cover themselves. Well, we don't. It doesn't look like it's really that bad. And your information is not being misused. And then in three weeks, it's like, oh, by the way, your Social Security number's out. Sorry, that sucks.
[00:20:51] Speaker B: Right? Step one, deny, deny, deny.
[00:20:54] Speaker A: Yeah, absolutely.
[00:20:55] Speaker B: Other than the fact that you can't deny we had a breach, because you can't, legally. Yeah, but deny that anything bad had happened during said breach. You know, they were just the sweetest attackers. Just real peaches. They just came in, they cleaned up a few things.
[00:21:07] Speaker A: Yeah.
[00:21:08] Speaker B: You know, gave us some advice about security, and they went on their merry way. We were very happy to have them. And then it's like, what do you mean? They stole a bunch of data and it's supposed to go up on the dark web for money? Ah, they weren't as nice as they said they were. I know, I know.
[00:21:22] Speaker A: They were such nice guys. We were going to get coffee on Friday. I can't believe this.
Pilates, yeah, we had an Orangetheory date. Come on. This initial data loss happened back in February. So who's to say more information won't come out about this? We'll have to wait and see. Hopefully not. Hopefully this is the end of it, but it almost never is. I think maybe we got time for a couple more, maybe, before we take a break. So we'll jump into this next one. A hacker has defaced a spyware app's site and dumped the database and source code. And another interesting part of this story: the spyware app was something called, I believe, PC Tattletale, spyware that's also referred to as stalkerware in this article, interchangeably. But this spyware application was found on the booking systems of several hotels in the United States. And that's kind of how they realized that they were leaking these archives with database and source code. So this is something that's intended to be an employee and child monitoring software. As a former child, I can tell you, if my mom had installed this on my computer, I might have been like, oh, come on, man. I'm trying to play Webkinz, and you're monitoring my activity. But I can see why you might want to use something like this. But it kind of shows, I guess, the risk of using something like this if somebody nefarious is able to gain access to it. And in this case, the database of the application itself. That's a lot of information. A lot of records, screen recordings, whatever the case may be, whatever it's collecting, that now is in the wrong hands. So I guess it's a risk you take by using something like this. Have you ever looked into any software like this, Daniel?
[00:22:52] Speaker B: For, like, for my kids?
[00:22:55] Speaker A: Right. Yeah. I know your kids are probably young at this point to be on the Internet, but is this something you think you might ever use?
[00:23:01] Speaker B: Um. I mean, maybe, uh. Cause, you know, you gotta keep your kids safe.
[00:23:07] Speaker A: Yeah.
[00:23:07] Speaker B: And. And I don't know. I'd have to look into that. I'm more of an advocate of just not letting them on the Internet.
[00:23:13] Speaker A: Yeah. Why take the chance?
[00:23:15] Speaker B: Very, very. Just open and supervised. I don't necessarily need a tattletale software because. Yeah, they're getting dumb phones. No, no kid of mine is getting a smartphone.
That's just. That's not happening. You can talk, you can text, and you can get GPS. What more do you need? Gotta be honest with you. But the interesting part of this was that this was on hotel booking systems.
So what the heck are they watching on a hotel booking system? Was this installed as stalkerware by an attacker, and another attacker found it? Like, I'm a little confused on this article, to be honest with you.
[00:23:57] Speaker A: I think, and I may be misinterpreting this, I think it was intended to be an employee monitoring software, like for the employees of the hotel. I could be wrong. I could be reading that wrong. But of course, it ended up being that somebody was able to get access to it that they shouldn't. It was leaking guest details and customer information that was captured from the hotel's check in systems. I mean, if you can record everything that's happening on a computer, and this is a hotel computer that's being used to access guest information and stuff, then, I mean, follow the bouncing ball.
[00:24:28] Speaker B: Yeah.
[00:24:28] Speaker A: That's information that's going to be tracked and saved and stored and all that stuff. So there was a serious vulnerability discovered in the PC Tattletale API. So any attacker could obtain the most recent screen capture recorded from any device on which this app is installed. I believe the creator of the app, the founder of this app, was trying to go in and fix it, because the website was defaced too. It wasn't just the app that was accessed. They went on and defaced the website for this application. And I guess he was able to get a screen grab or a screen recording of this guy attempting to then go back in and fix the website. So he's almost like, haha, I can see you trying to fix it. It's not going to work.
So just kind of, kind of being a turd about it, honestly.
So I guess it's. I guess, yeah. When you download something like this and you're recording all this information, you take the risk that if somebody gets access to it, that stuff's gonna, they're gonna.
[00:25:19] Speaker B: Have mine and all this stuff. You know, my confusion really comes in the fact that they found it on a Wyndham hotel booking system. So they found the software on a Wyndham hotel system. So Wyndham is using it. So that's where I'm confused. Is Wyndham using this knowingly? Or were, you know what I mean? And like, what? So why go attack PC Tattletale? Wyndham is the one that's using it.
Why not attack them? If they're the ones using, if they're not, if they've been hacked and then someone installed PC tattletale, then that makes more sense. And that's where I'm a little confused. Maybe I just have to more carefully read this article.
[00:25:54] Speaker A: But yeah, I don't know that they're very clear about that, about whether it was installed intentionally, that Wyndham was like, yeah, this is software, we have to make sure our employees aren't doing stuff they shouldn't be doing. Or if this was like, oh, PC Tattletale? Right. Yeah, we knew about that. And they actually didn't.
[00:26:12] Speaker B: So, because it says right here, unfortunately, PC Tattletale have ignored Zach in its attempts to contact him to fix the issue. What's the issue? The issue was it's leaking screen captures.
[00:26:23] Speaker A: Yeah, I guess it was an IDOR. Something in there, in the PC Tattletale API, was the issue that I guess the person that discovered this was trying to report.
And this is not the first time that this, they call it a stalkerware. Stalkerware app has had an issue. A couple years ago, it was found leaking real time screenshots from Android phones. So it's not the first security issue that the application has had.
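For anyone wondering what the IDOR mentioned above actually looks like at the code level, here is a minimal sketch of the server-side ownership check that an IDOR-vulnerable endpoint skips. The route, data store, and field names are invented for illustration; this is not PC Tattletale's real API.

```python
# Hypothetical Flask endpoint. The IDOR class of bug exists when the server trusts a
# caller-supplied device_id and returns the capture without checking that the device
# belongs to the logged-in account -- the abort(403) line below is the missing check.
from flask import Flask, abort, jsonify

app = Flask(__name__)

# stand-in data store: device id -> owning account and latest screen capture
DEVICES = {
    "1001": {"owner": "alice", "latest_capture": "capture_1001.png"},
    "1002": {"owner": "bob",   "latest_capture": "capture_1002.png"},
}

def current_user():
    return "alice"   # stand-in for real session handling

@app.route("/api/devices/<device_id>/latest-capture")
def latest_capture(device_id):
    device = DEVICES.get(device_id)
    if device is None:
        abort(404)
    if device["owner"] != current_user():
        abort(403)   # without this, anyone can walk device IDs and pull captures
    return jsonify({"capture": device["latest_capture"]})
```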
[00:26:46] Speaker B: It looks like it's in good company. Every software has had security issues.
[00:26:52] Speaker A: That's true.
[00:26:53] Speaker B: And they've been, they've been disclosed and, you know, patches come out and so on and so forth. Like I said, this, this, this is just a weirdly worded article.
You're better than that, BleepingComputer. Yeah, you didn't make this very clear. This is bush league, right? It just seems a little muddled. It's hard to follow.
And so here's my question. Did a security researcher discover this vulnerability and then disclose it to PC Tattletale? And then PC Tattletale did nothing about it, and therefore that security researcher decided to deface the PC Tattletale website?
[00:27:31] Speaker A: I think that's exactly what happened. I think they tried to reach out, do responsible disclosure. The website or the people running the application. The folks in charge did not ever respond. They didn't fix the flaw. They didn't do anything about it.
[00:27:43] Speaker B: So that's illegal.
[00:27:45] Speaker A: So the researcher didn't share a ton of information about it. He tried to limit what he shared, but then somebody took it as a challenge and was like, oh, okay, let me see what I can do. And that's when they defaced the website, leaked a bunch of archives with source code and data from their databases. So a little scary to see.
[00:28:02] Speaker B: Who the hell is Zack, right?
[00:28:06] Speaker A: It is written a little funny.
[00:28:08] Speaker B: Is Zack, like... the first time? This is: unfortunately, PC Tattletale have ignored Zack. Who the hell is Zack? I haven't seen Zack's name. I see Joe Koskia.
[00:28:17] Speaker A: So the... Daigle. Who's... it looks like Eric Daigle's the main guy. That's okay.
[00:28:25] Speaker B: I saw Eric Daigle, I guess. Don't see a Zack in here. Who the hell is Zack?
[00:28:29] Speaker A: Yeah, he mentioned Zack in a quote, but they don't clarify who that is. So his buddies. Maybe he's helping him out. I don't know.
[00:28:37] Speaker B: Doesn't make any sense.
[00:28:38] Speaker A: Yeah, this camp, this is probably not the. At this point, now that it's. It's been a few days, there's gotta be other write ups on this, so I'll have to look and see if I can't find something that makes it a little bit more clear.
But, yeah, as of now, the person.
Yeah. And since this was reported, the person that breached the site shared a video of what they claim is the website owner trying to restore the site. So that was when they were like, oh, look, they're trying to fix it. It's not gonna work. So kind of being a turd about it. So the website's now been taken offline. They're trying to do some damage control, I think, here. So I'll look into this to see if I can't find something that is a little bit more clear. No hate to BleepingComputer. Usually love their stuff. This article is just a little tough to follow. I'll have to see if I can find something a little bit better.
[00:29:22] Speaker B: What?
[00:29:22] Speaker A: What say you, Daniel?
[00:29:24] Speaker B: I can't wait. I love to have a more clear and concise piece of writing that tells me what the heck is going on with this.
[00:29:31] Speaker A: All right, well, in the meantime, then, we're coming up on half an hour here, so this might be a good place to take a break. Give me some time to maybe find a little bit more of a clear source on this. So we'll go and do that. A quick break here, but we still have several articles to go, so don't go away. We'll be right back with more on Technado. Tired of trying to schedule your team's time around in-person learning? Isn't it a bummer to spend thousands of dollars on travel for professional development? What if we said you can save money and time and still provide your team with the best training possible? The answer to your woes is live online training from ACI Learning. With live online training, we provide our top in-person courses in private online instructor-led formats. You get to provide professional development in a manner that fits today's expectations: entertaining, convenient, and effective. Our exam-aligned courses inspire the full potential of your team. Visit virtual instructor-led training at ACI Learning for more info.
Welcome back to Technado. Thanks for sticking with us through that break. We took some time, did a little fact checking, so we've got some. Some more information on a couple of the articles we talked about earlier. Daniel, I might. I might toss it to you first, because I know one of the first things we talked about was that new zero day exploit that Google has released some information about. And we talked a little bit about type confusion vulnerabilities. And I understand you wanted to clarify something.
[00:30:46] Speaker B: Oh, yeah. So I was telling you, your brain will go, hey, you know the thing you said, I'm not 100% on that. You might want to check yourself. And I was talking about the type confusion vulnerability. I mentioned it was data types, but it's actually objects.
If I was reading here, it says that.
Where is this? I should have had this up. When the program accesses the resource using an incompatible type, this could trigger a logical error because the resource does not have the expected properties. And in languages without memory safety, such as C and C++, type confusion can lead to out-of-bounds memory access. So basically what they're talking about, it's not a data type confusion, it's an object. So I can make an object, and that object could be the wrong type of object, and that can cause the confusion. Therefore. Yeah, I just wanted to clarify that, because it was like, everything else was right, but there was something telling me that it wasn't data types, and it was right. It's objects.
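A toy illustration of that object-type confusion, with invented classes: Python is memory-safe, so the worst case here is an exception, whereas in C or C++, as the definition Daniel read says, the same confusion can turn into out-of-bounds memory access.

```python
# Toy example: code written for one kind of object receives another.
class AudioClip:
    def duration(self):
        return 30

class ImageFile:
    def width(self):
        return 640

def play(clip):
    # silently assumes it was handed an AudioClip
    return clip.duration()

print(play(AudioClip()))   # works as the author intended

try:
    play(ImageFile())      # wrong object type: Python raises instead of corrupting memory
except AttributeError as err:
    print("type confusion caught:", err)
```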
[00:31:38] Speaker A: Okay. We love that. Still, small voice.
[00:31:41] Speaker B: It's fun. Like something, you know, when you're, when you're just talking, you know, you say things.
[00:31:46] Speaker A: Yeah. Oh, all the time. I yap so often. I'm sure I've said plenty of things that later I'm like, I don't know about that one. And of course, one of the, one of those things, the article that we were looking at for that story on the spyware that was found on those hotel computers, maybe not. Maybe not the best choice to cover that. So that's on me. But I did find a different source that gives a little bit more information.
This one's from TechCrunch. Same story. It is not known who planted the app or how the app was planted. So that's kind of what we were trying to figure out was how did this app get on the computers? We don't know yet, and that's why it wasn't included in the article. They just didn't clarify it or make mention of it specifically. So we don't know if employees were phished and tricked into installing it, if it was supposed to monitor employee behavior, and it was just misused.
But another thing that this article included that I thought was interesting. It's a little bit of a different, you know, kind of a side street. But this spyware app, PC Tattletale, offers a service called We Do It For You. So, hey, we can install the spyware on your behalf. All you got to do is pick a time. You'll get an email with instructions for us to access their computer. No traces left behind, all tracks covered. That, to me... it does mention that this type of software operates in kind of a murky legal space. I don't know how that's legal at all. Like, if somebody, even if it was like my mom, if she was like, yeah, I'm just going to put something on my adult daughter's computer, and I didn't know that it was there. How is that not illegal?
[00:33:11] Speaker B: How it's going to depend on who owns the device. If she owns the device, she can do anything she wants to it.
[00:33:18] Speaker A: Okay, I see. So I guess, yeah, if you're talking about your kids or your employees, then at that point it's like, well, but yeah, I guess using it against people without their knowledge and consent is unlawful. But if it's on your device, if you read.
[00:33:32] Speaker B: If you read our ACI employee handbook, there's a whole section in there about how they have the right to monitor you.
[00:33:38] Speaker A: Right.
[00:33:39] Speaker B: And monitor your devices that they issue you, because you don't own them. They're not yours. They're theirs.
[00:33:44] Speaker A: Yeah. I kind of assume they're doing that anyway. Like, as an employee, I just kind of assume. I'm like, you want to look at my technical articles? Okay, go for it.
[00:33:52] Speaker B: That is a good assumption.
[00:33:53] Speaker A: Yeah, yeah, yeah. The most that I'm doing on here is, oh, look, Techcrunch. You know, so. Yeah, but, yeah, that just, that service that they offer, I was like, oh, that gives me the heebie jeebies. I don't like that. Just makes me, there's just something about it that I'm like, that doesn't seem right. So who knows? It could be that that's maybe whoever it was that's responsible for this attack utilized that kind of a service and was able to install stuff without the employees or managers of these hotels ever knowing that it was. That it was going on. So maybe some. It's possible. Anything's possible. At this point, we don't have a ton of information, so maybe some more stuff will come out about this in the future about how that app got onto those computers. Who knows? We might see this again on another.
[00:34:32] Speaker B: I still want to know who Zach is.
[00:34:36] Speaker A: Still not clear on that.
[00:34:38] Speaker B: Throw a name out there.
[00:34:39] Speaker A: Well, to see if we could find him, give him some credit for his work. Him and Eric Daigle are apparently responsible for figuring this out. So we'll have to see if we can find some information on him. But we have a couple more segments we want to get through here on Technado. This next one's another favorite: Behind Bars.
[00:34:59] Speaker B: Break the law and you'll go to jail.
[00:35:03] Speaker A: Literally. So true. We actually have a couple articles that kind of fall under this heading. This time, a man behind the deepfake Biden robocall was indicted on felony charges and faces a $6 million fine. That's no small potatoes. That's a lot of money. I don't think. Yeah, I don't think most people probably see that money in their lifetime. So not sure how he's going to cover that. But this guy, Stephen Kramer, was responsible for using AI-generated voice cloning technology to impersonate the president and using caller ID spoofing. So it's not like he just made a video and uploaded it and he's in trouble for that. He went to pretty extreme lengths to create this voice clone of the president. He wrote a script for the call telling people not to vote in this primary, paid somebody to use artificial intelligence to record that script, and then hired a telemarketing firm to play the recording to, like, thousands of people over the phone. So this was a pretty drawn out operation. He's not just getting fined or getting arrested for making a video. Like, he did a lot of stuff that led to this point.
It would be funny if it wasn't scary, you know, like the fact that he was able to pull this off. Uh, it's just, it's freaky to think about. That's my opinion. What do you think?
[00:36:16] Speaker B: Yeah, uh, definitely. We live in a frightening, frightening world at this point. With the advent of these different AI's and what they can do.
Honestly, I didn't think it was that difficult. And as soon as these different AI tools came out, I was like, oh, yeah, this is going to be a problem, right? Because it's not that difficult to go to ElevenLabs and create Joe Biden, Donald Trump, Obama, you name the celebrity or politician, and write a script that says, I'm Donald Trump and I'm telling you to go buy cats for Thanksgiving or whatever. Make them say whatever you want them to say. And again, that's not illegal to do. It's more fun, and it has made some highly entertaining content out there on the Internet. I'm happy for that side of it because it has made me chuckle many, many times. But this guy was like, well, what's stopping me from just using it politically? And this kind of gets into the whole legal aspect of the thing, where you created something for the intents and purposes of, maybe it falls under slander, libel laws, impersonation of public officials. Now you're meaning to trick people for a purpose, that is, for elections and so on and so forth.
Really, for me, with this article, the interesting stuff that I kind of pulled away from it is: hey, I didn't think it was that difficult. So you paid a guy $150 to read a script to the AI and have it make it sound like Joe Biden. And then you took that recording and gave it to a PR firm and said, hey, robocall this. And they went, okay, well, that's what we do. We robocall. You pay us money, we robocall for you. And he went, cool, thank you. And they did. And because they have a high level of attestation as far as their caller ID goes, that's less likely to be seen, or blocked, as spam or malicious. So it made its way. Now, the fact that they're looking to fine him $6 million, that's what's interesting to me, really. That's one of the bigger fun things that came out of this article. Why not $6 trillion?
Like, this guy paid $500 to do this. I don't think he's got the millions to get that fine paid off.
[00:38:47] Speaker A: Yeah.
[00:38:48] Speaker B: And why a fine? Why wouldn't this be some sort of, like, I mean, it is criminal charge. So is that the only thing they can do is fine him? Is that the legal extent of what they can do? They had him up on federal charges, so. Or felon. I'm sorry, felony charges. I don't know if they're federal charges or not, but did say there were like nine felony charges.
So is that not enough to have a man do a couple of months in the, the county hotel or the state hotel or whatever the case is?
Um, why $6 million?
[00:39:18] Speaker A: It's a good question.
[00:39:19] Speaker B: Maybe because it's FCC like.
[00:39:21] Speaker A: Yeah, why would they go that route? Is like just a very large fine as opposed to actual hard time. Because.
[00:39:27] Speaker B: Yeah, I'm guessing it's because it's through the FCC.
[00:39:30] Speaker A: Could be.
[00:39:30] Speaker B: And the FCC doesn't have, like, prosecution power. May I don't know.
[00:39:34] Speaker A: Yeah. But he was charged with 13 felony counts of voter suppression, 13 misdemeanor counts of impersonation of a candidate. So he definitely has several, several counts here that he's being charged with it. Maybe it's something related to the fact that this. If this is finalized, if this. If this goes through, it'll be the first ever enforcement action related to spoofed deep fake robocalls in the US. So maybe because they've not really ever seen anything like this before or had to charge for something like this before, you know, because.
[00:40:01] Speaker B: Right. They gotta set precedent.
[00:40:03] Speaker A: Right, exactly. So, yeah, maybe that's the case. They're going extreme.
[00:40:07] Speaker B: How do you feel about living in a world where you can't trust any media? I mean, have you seen the images that AI can create and how realistic they look now?
Yeah. Leave. Leave hands and certain things out of the. Out of the equation.
[00:40:23] Speaker A: Yeah.
[00:40:24] Speaker B: And you've got a very realistic looking thing or just doing deepfakes, the whole deep fake idea.
[00:40:31] Speaker A: There have been a couple times where it's. It's got me, even just for a second, there will be a video of some celebrity, and they can make it so that it looks like their mouth is moving differently. They've obviously gotten out the ability to create, like, a voice clone or whatever, so it'll look like it's. It's some celebrity doing an interview. And it's only after looking at it for a second and focusing, like, on the mouth that I'm like, okay, this is, you know, somebody. Somebody faked this. But it's convincing, if only for a moment. Sometimes a moment's all, you know?
[00:40:54] Speaker B: And that's. You nailed it. That's when it comes to things like political stuff or. Or whatever. Maybe I'm trying to scam you out of money. You just need. I just need you to believe me for that moment. It's fine. If you find out a day later that that wasn't real, damage is already done.
[00:41:12] Speaker A: Yeah.
[00:41:13] Speaker B: Right. So, I mean, it's just scary as hell to think, like, I want some sort of validation.
I'm the kind of person, I really enjoy getting all the information and then kind of working it out and figuring it out and then going, okay, this is my conclusion. And then listening to other people's conclusions to see, oh, maybe I missed something on that, or, oh, you're not thinking about this. You know what I mean? And now that makes that difficult to be able to go, well, here's the evidence. Here's the receipts for why I believe what I believe how I came to my conclusion and someone goes, oh, did you know that's a deep fake? You go, oh, goodness gracious. Right. I hate for people to get into that because we make our decisions based off of the information we're given from sources. If we can't trust those sources, if we can't trust the information that's coming to us, then you can't make a decision.
It's very, it's very difficult. Right. So we got to have some sort of way to validate our sources at this point with something we can trust, that we can all trust.
[00:42:18] Speaker A: Yeah, it could have something to do with the fact that it was such a high amount that he's being fined, because he played the recording to more than 5,000 voters over the phone. So if they take each of those as an individual count of him doing something, in theory... there was somebody in the comments that was talking about it. And if you were to fine him the typical amount for causing this problem and committing this crime, and you took each of those as an individual count, 6 million might be him getting off easy. So it'll be interesting to see if this is getting off easy. Could be. I mean, yeah, it's an interesting point raised in the comment section here. This is from The Register. So if you want to go check out those comments, you'll be able to click the link in the description and check that out. But it'll be interesting to see.
[00:43:00] Speaker B: My thought was like 6 million or 6 trillion, this guy doesn't have the money regardless.
[00:43:04] Speaker A: Oh, right, right. Yeah.
[00:43:06] Speaker B: He can get off as easy as you quote unquote, like it doesn't matter. He can't afford to pay that.
[00:43:11] Speaker A: That's true. That's a good point.
[00:43:13] Speaker B: So if you want to set a precedent, go ahead and shoot for the moon.
You're never going to see a nickel of that money out of this guy.
[00:43:19] Speaker A: That's a good point. That's a good point. I'd be curious to know what you all think of this case and whether this will actually go somewhere, whether that $6 million fine will stay where it's at or maybe some things will change. Maybe this will be something that we cover again in the future on Technato. So leave a comment, let us know what you think. I always love hearing that.
[00:43:36] Speaker B: I want to see his payoff day when he's like, I'm making my last payment for that $6 million fine, FCC. Here it is.
[00:43:43] Speaker A: I'm free at the age of 276, making my final payment. So this is actually not the only Behind Bars segment that we've got today. We've got a couple articles that I felt like fit in this category. So this next one comes to us from BleepingComputer: an Indian man stole $37 million in crypto using a fake Coinbase Pro site. I believe he was arrested in Atlanta, I think. Yeah, at the Atlanta airport back in December. And so now he has just pleaded guilty to wire fraud, and he's going to do some time, I think. So this is not, I mean, this. I feel like we've covered similar stuff to this over the last several weeks relating to things like Coinbase Pro, things like crypto. Like, it just seems like this is an ongoing issue. At least people are getting caught and doing time for it. It's a positive, I guess.
[00:44:28] Speaker B: Yeah, this guy, well, this is a tale as old as time as far as how he performed his theft, which is: create a scam or a fake site for the purposes of credential harvesting. Once you've got creds, guess what? I mean, that party's over. At that point, you just do whatever you want. You log in and start transferring funds. Because if I present an electronic system, an authentication system, with the proper username, password and MFA code, it's going to go, well, it must be you, and come on in. That's all it does. Yeah, come on in. It has no other way to, like, see that you would not be you.
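A minimal sketch of that point, using TOTP as the example (an assumption, since the coverage doesn't say which MFA type the victims used): the server only checks that the submitted code is currently valid, so a phished code relayed inside its window passes exactly like the real user's.

```python
# The server-side TOTP check proves possession of a currently valid code, nothing more.
# It cannot tell a code the real user typed from one an attacker phished and replayed
# a few seconds later. Uses the pyotp library.
import pyotp

SECRET = pyotp.random_base32()      # provisioned to the user's authenticator app
totp = pyotp.TOTP(SECRET)

def login(username: str, password: str, otp_code: str) -> bool:
    # assume the password was already verified elsewhere
    return totp.verify(otp_code)    # True for anyone holding a fresh code

code = totp.now()                         # what the victim reads off their phone...
print(login("victim", "hunter2", code))   # ...and what a phisher can relay: True
```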
[00:45:13] Speaker A: Yeah.
[00:45:14] Speaker B: And, and that gets into the idea of how, how weak those systems are and whether or not we could come up with something better.
You know, based off of various and sundry different mechanisms that have been purported that might be useful for that. But ultimately, it was a simple scam, but a very big payout. Right? This guy made $37 million doing this. Now, it wasn't just that. All that money didn't go just to him. I think it was a group of people that scammed it, because I will say they did have some fairly sophisticated, I say sophisticated, they had a good infrastructure built to perform their social engineering attacks and everything they needed to gain access to these systems. Because they would get people to steer toward the fake portal. When they tried to log in, they would get an error. And that error said... oh, I think that's how, if I'm reading it correctly, they may have just received emails that said, hey, there's a problem, please contact our support. And people do. They contact, excuse me, the support number that's given to them. And they were then greeted with a fake support person who was then saying, well, let me install some software, or give me your 2FA codes so that I can log in, give me your access code so I can do all this stuff. And of course, as someone who has worked on a help desk, I could tell the person on the other end of the phone to give me their Social Security number, their routing number, their account number, their everything number, and they would do it.
[00:46:43] Speaker A: Yeah.
[00:46:44] Speaker B: Because they just trust that they are talking to people that are trying to help them.
[00:46:49] Speaker A: Yeah.
[00:46:49] Speaker B: That's unfortunately how they were able to manipulate people into getting access.
[00:46:54] Speaker A: Yeah. When you hear about phishing or whatever, that's usually one of the main things that they use to get people to do what they want them to do. Right. Authority or urgency and fear and things like that. Promises of great riches, all that kind of stuff. So in this case it would be, yeah, hey, I'm a representative, I'm here to help you. And, oh, it looks like there's a problem with your account. We better make sure we go ahead and fix that. And yeah, if you don't know any better, you might. Or if you're worried about your account. Right. Your Coinbase account. Gosh, I guess, yeah, I really do want to fix this. I don't want anything to happen to my account, so I could see somebody in a panic falling for something like this. But Coinbase Pro has been defunct, I think, for at least a year or two. Yeah, it was shut down back in 2022.
Its functionality and features were integrated into the main Coinbase platform. So Coinbase Pro was the one that he was impersonating, that this guy and his team were impersonating, and that's now defunct. But that didn't stop him.
So 37 million, like you said, was the amount they were able to rake in from this. I bet you can't guess what he's being fined after talking about the last story that we had where he's being fined $6 million.
[00:47:53] Speaker B: $6 million for the $37 million fine.
[00:47:57] Speaker A: Right. That's the theme for today, the $6 million fine.
[00:47:59] Speaker B: Yeah.
[00:48:00] Speaker A: No, maximum prison sentence of 20 years, which is. That's pretty long, pretty hefty, but a fine of quarter million dollars. I feel like, I mean, I know these are different crimes. I feel like comparatively, to look at what the last guy did, not saying that was a good thing. Right. But the $6 million fine compared to this, in addition to the, to the 20 years in prison, which is nothing to sneeze at. Um, but I mean, he pled guilty, so he's, he's definitely going to serve some time and then probably pay, pay up a little bit, comparatively, I guess, to, to the money he stole and the folks that he tricked. It's not going to get that money back. So there's nothing you can really do about that. Um, bam, this guy's going to do some hard times. So.
[00:48:40] Speaker B: Yeah, 20, I mean, it says a maximum of 20 years. So he obviously hasn't yet. He's just been arrested. Uh, you know, they'll go to trial and everything, and then after that they'll determine whether or not he's guilty, because in America, you are innocent until proven guilty.
Um, and since he's only been arrested, he's only been charged, so he's not been found guilty of it yet. Right.
[00:49:01] Speaker A: He pled guilty. He was arrested back in December and he just pleaded guilty to wire fraud.
[00:49:07] Speaker B: What does it say that he now faces a maximum? Like, how come they didn't tell us.
[00:49:11] Speaker A: What he, they're still.
[00:49:12] Speaker B: The dates of the hearing and sentencing have yet to be determined.
[00:49:15] Speaker A: Right. So he's. He did it.
[00:49:17] Speaker B: He admitted he's in deep doo doo.
[00:49:19] Speaker A: Then he copped to it. He.
[00:49:22] Speaker B: Hopefully they throw the book at him.
[00:49:24] Speaker A: Yeah, yeah, I agree.
[00:49:26] Speaker B: So put him under the prison.
[00:49:29] Speaker A: Okay. That's a new one. Yeah, put him under a new cell.
[00:49:32] Speaker B: Where it's just him.
[00:49:34] Speaker A: I learned something. I learned something new.
[00:49:36] Speaker B: Ripping people off, man, I'm sick of that junk, you know, cuz, you know, it's grandma that is mostly getting ripped off out there.
[00:49:42] Speaker A: Yeah, it's.
[00:49:43] Speaker B: This was crypto people, so it was probably a bunch of hooligans anyway.
[00:49:47] Speaker A: Could have been. Could have been a grandma.
[00:49:48] Speaker B: I'm identifying with my attacker at this point.
[00:49:51] Speaker A: It's a bunch of hooligans now.
[00:49:52] Speaker B: I've been Stockholmed.
[00:49:54] Speaker A: It's those NFT people. Yeah, that's. It's going to happen. It's going to happen when you get.
[00:49:58] Speaker B: What they got coming.
[00:50:01] Speaker A: Well, believe it or not, we've got yet another story that I felt like fell kind of under the behind-bars headline. Nobody's going behind bars, but this was an issue that had to do with prisons, courts and governments. So there's a courtroom recording platform. I'm going to call it "jabs" because it sounds like Jeeves and I think it's fun. JAVS, hijacked in a supply chain attack. So more than 10,000 installations across prisons, courts, and governments. So a lot of people impacted by this, a lot of different facilities impacted by this. It's a Windows version of the RustDoor installer spreading via a compromised audio-visual software package. So, a lot of stuff going on here. Daniel, maybe you can break this down for me a little bit, because this is one I know that you were interested in, and I had a chance to read through it a little bit, but I'm still a little bit lost on what exactly happened here.
[00:50:49] Speaker B: So.
Supply chain attack.
Those are some of the scariest words in the cybersecurity arsenal, at least some of them: hearing that we have a supply chain attack issue. And the reason that that is a scary thing is because I didn't attack you directly. I attacked a piece of software or an organization whose services you utilize in some way. So when it came to you, it was legitimate. There was no way to tell that it was malicious in any way until the malicious thing occurred and you detected it in some shape or form. Right? So that's why supply chain attacks are scary. So if I want to go after Sophia and, you know, steal her credit cards and do all that stuff and have C2 control over her computers, well, maybe I find out that you're a music lover, and I hack Spotify. I hack the Spotify app. I have access to the Spotify app's codebase. I make changes to said codebase.
And then the next time you update your Spotify app, guess what happens? You get the new code change that me as an attacker have made. I now have access to you. Right? And antivirus systems go, oh, that's Spotify. It came from Spotify. It's signed by Spotify, or whatever. I keep using Spotify. So, to be clear, nothing actually happened to Spotify.
Right.
[00:52:21] Speaker A: Example.
[00:52:22] Speaker B: Let's put it back in the context of JAVS here. So, this place is called Justice AV Solutions. They are the vendor for the JAVS software. Right. The viewer. So somebody hacked into Justice AV Solutions as a corporation. They had access to their code base. They made malicious changes to said code base, and then when organizations like courtrooms and prisons and so on and so forth downloaded and utilized that version of the JAVS software, they were now compromised.
So that is the problem. And it was RustDoor, which is just fancy malware written in Rust, that gave them the ability to listen in and do other things with that software.
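To make that supply-chain point a little more concrete, here is a minimal sketch of the kind of integrity check a defender can run when a vendor's installer is under suspicion: hash the file you actually downloaded and compare it to a value the vendor published out of band. The installer file name and the known-good hash below are placeholders for illustration, not the real JAVS artifacts.

# Illustrative integrity-check sketch; the file name and "known good" hash are hypothetical placeholders.
import hashlib
import sys

KNOWN_GOOD_SHA256 = "0" * 64  # placeholder: substitute the hash the vendor publishes

def sha256_of(path: str) -> str:
    # Stream the file in chunks so large installers don't have to fit in memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "viewer-setup.exe"  # hypothetical installer name
    actual = sha256_of(path)
    if actual == KNOWN_GOOD_SHA256:
        print("Hash matches the published value.")
    else:
        print(f"MISMATCH: got {actual}; treat this installer as untrusted.")

Of course, that only helps if the published hash itself wasn't generated from the tampered build, which is part of what makes supply chain attacks so nasty.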
[00:53:09] Speaker A: Okay. Scary. That makes sense, I think.
[00:53:12] Speaker B: You have any further questions?
[00:53:14] Speaker A: I always have questions, Daniel. You should know me well enough by now. I'm always going to have some questions. Some of them are probably going to be some pretty obvious questions.
[00:53:21] Speaker B: So, Madam Prosecutor, what is it this time?
[00:53:25] Speaker A: It says, I guess, RustDoor targeted Macs, and the Windows version was written in Golang and not Rust. And so there were, I guess, two different versions of this that were operating. So the first malicious versions of these JAVS Viewer packages appeared back in February; people took notice of these. And Rapid7 is the group that was investigating this, and they first started investigating it back in May, or the beginning of May.
JAVS has since removed those corrupted files, so that's good. And they did share that no source code or certificates or systems or anything like that were compromised in this incident. I hope that's true. It does seem like sometimes things come out later, so we'll have to wait and see. But they do recommend that any customers that use this software delete and replace it and then reimage their affected endpoints and reset their credentials. So basically, let's just reset everything just to be safe. They say this is high risk.
[00:54:20] Speaker B: Here's the question. They said that no source code was affected, right?
[00:54:24] Speaker A: They did say that, right?
[00:54:26] Speaker B: But then it says threat actors corrupted Justice AV's Viewer version 8.3.7, which is used. How, how did they corrupt that?
[00:54:35] Speaker A: Good question.
That's a good question. I don't know if they talk about that at all.
[00:54:40] Speaker B: That, that's an interesting little tidbit of information.
Typically that's done through the source code, because the attacker had access to said source code and could make the malicious modifications to incorporate the RustDoor or the Go variant, whatever the heck it was. The Go version.
Right. So I find that interesting.
I'm not saying it's impossible, I'm just saying I would love to know how they did that.
[00:55:07] Speaker A: Yeah, I don't. I'm looking at the Rapid7 write-up of it, and I don't know if they mention how they actually gained access. They talk about, oh, once this is deployed, this is how it works, and this is what was corrupted, but not how they actually got to that point.
[00:55:22] Speaker B: So maybe. What makes them believe that they did not have access to their certificates or keys or any other thing that was in their environment?
[00:55:30] Speaker A: Hmm. I don't know. There's a bit of a bold statement to make, I guess, considering what happened.
[00:55:35] Speaker B: Step one in a breach: deny, deny, deny.
[00:55:38] Speaker A: Yeah, absolutely. Right. So they did say, basically, hey, just to be safe: reset, delete and replace. The software users are at high risk and should take immediate action. So a little scary, but don't panic. Just reset everything and you'll be fine. Uh, so I just thought it was interesting. I'm glad that you brought this one up because, um, given the holiday weekend, we were kind of looking at articles, trying to pull stuff, and this was one that I had missed. And so I'm glad that we got to talk about this a little bit. Scary stuff. Now we've got a couple more articles we want to jump into. And this is another segment that we enjoy. Deja news. Deja news.
I would argue that's probably our grooviest segment. I don't know, maybe I'm wrong. So you might remember that we talked a little bit about this last week. Apple was not storing deleted iOS photos in iCloud after all. So there was a bug, if you remember, that was causing folks' deleted photos to resurface in their iCloud. And it's kind of concerning. And it wasn't just recently deleted photos. There were people that were saying, hey, I deleted these photos years ago, and they keep popping back up no matter how many times I delete them. Apple was able to release a fix for the bug, but they never really talked about, hey, this is what was causing it, this is what happened. They fixed it, and then that was it. So we now finally have some information, it looks like, on what was actually causing these reappearing images. So we'll jump into that. Daniel, I don't know if this was something that you had a chance to look at much, um, given, you know, Daniel's recovering right now. His arm is still healing, and he's got a lot going on. He's also got kids, and I don't, so I have more time to look at this stuff. And I got little kids. Little kids, right. Yeah, that is an important clarification. So the reappearing images. It looks like, because Apple was quiet about it, people were kind of speculating, what's the issue? Researchers were able to figure it out. They reverse engineered the update that addressed the problem and were able to take a look at what the update actually did. I'd have to go back in and look. This is another BleepingComputer article. It's a little bit short. It's not like an in-depth explanation. Apple removed a routine in the function responsible for scanning and re-importing photos. It was causing it to re-index old files and add them back to people's galleries even after they'd been deleted. So nothing malicious. It was just an oopsie, I guess, which is good. That's a good thing.
[00:57:54] Speaker B: The good part about it is that these were not going up into the cloud, I think, right.
[00:57:59] Speaker A: They were still in the file system.
[00:58:00] Speaker B: If I've kind of said, yeah, I don't need that anymore and I want it deleted, then that's good. They did not create some sort of weird backup in the cloud. And that's good for us to know as customers. If you're an Apple customer, you can go, if I have deleted it, then it is gone. I don't have to worry about Apple holding on to that.
That can be both good and bad, I guess. But mostly I think that's a good thing, is that if I want it gone, then I want it gone. And I should have say so over that. Apple's not doing some weird hanging on to things. It was just an odd misconfiguration in the iOS itself that was allowing for these to be retrieved locally. So that's good.
[00:58:44] Speaker A: And this was Synacktiv that reverse-engineered this and figured out what exactly the bug fix did. Apple has not technically confirmed this. They did reach out to Apple for some kind of a response, and Apple has not said anything as of yet. So technically no confirmation from Apple. But based on these findings, it looks like, yeah, okay, Apple's not storing deleted files in the cloud. That's good news. They're not accidentally being restored from the cloud. But they do mention it's a good reminder that even if you delete something, it could still be in the local storage. So good thing to keep in mind. But like you said.
[00:59:17] Speaker B: Yeah, if you've never messed around with digital forensics, this is what it's all about is the fact that some of these files have not been overwritten. They have not been anything. They're just flagged as free to use on the file system. So the file system just goes, okay, yeah, there's data there, but I can write over it if I want. Even then you might not write over all of it. You might just write over some of it. So just because you delete something doesn't necessarily mean it's deleted.
That's why we have safe practices and procedures for destroying digital things like hard drives, ram chips, and things of that nature that hold information on them. If I need to decommission something and it has sensitive information, I need to make sure that that data is gone. A lot of times that is done through physical destruction where I take the disk itself and I stick it in a meat grinder, that's literally what it looks like. It just kind of shreds the whole entire device, turning it into metal confetti. It's really cool. If you've never seen it, it's really, really cool to watch. Uh, you could stick a refrigerator in some of these things and it'll chew it up. It acts like it's nothing. Don't get your hand caught.
I will say, keep your hands back, because you're going to draw back a nub; it will take it. So just be careful around those things. You can also use magnetic tools for trying to magnetically erase them. And then there's the overwriting capability of some tools, like, I think it's called shred, which is a tool in Linux. There's also BleachBit, which is another piece of software that does this. So basically, think of it this way: you ever write something on a notepad, or you've seen it in a movie, well, I wonder who the last person they called from this phone was, and they see the pad, but there's nothing on it. And you take a pencil and shade over it, and you can see the impression of the last phone number or message that was written down. Think of that as your data, right? Even though you deleted it, the impression of it is still basically there. And if I know how to shade it, I can bring that back up. Okay, but what if I write my message down, I pull that piece of paper off, and then I write a bunch of ones and zeros over where the impression of the message was.
And then I make lines going left and lines going up and down, and I make lines going sideways and crisscross.
And then I write the alphabet, and then I go over it with another bunch of zeros and then some squiggles. You're going to have a hard time recreating the first impression that was imprinted onto that pad. And that's what a lot of that software will do: it will create that environment where it basically just overwrites with a bunch of data, making it much more difficult to retrieve. I think there's even a DoD specification for how much and what type of overwriting can be done for data to be deemed securely destroyed.
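As a rough illustration of the "write over the impression" idea Daniel describes, here is a minimal sketch of a shred-style multi-pass overwrite in Python. It is not a substitute for purpose-built tools like shred or BleachBit, and overwriting is famously unreliable on SSDs and on journaling or copy-on-write filesystems; the file name in the usage line is hypothetical.

# Simplified multi-pass overwrite before deletion (illustrative sketch only).
import os
import secrets

def overwrite_and_delete(path: str, passes: int = 3) -> None:
    # Overwrite the file's bytes in place several times, then unlink it.
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            remaining = size
            while remaining > 0:
                chunk = min(remaining, 1 << 20)
                f.write(secrets.token_bytes(chunk))  # one chunk of random bytes
                remaining -= chunk
            f.flush()
            os.fsync(f.fileno())  # force this pass to disk before starting the next
    os.remove(path)

# overwrite_and_delete("old_notes.txt")  # hypothetical file name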
[01:02:24] Speaker A: That's a good explanation. That's a nice visual. That that kind of puts it in perspective. You should be a teacher, Dan. You ever thought about doing training?
[01:02:30] Speaker B: I should look into that. Look into that.
[01:02:33] Speaker A: Doing some cyber security training. I think you'd be really good at it. But that's just me. Maybe I'm wrong.
[01:02:36] Speaker B: Might be a calling.
[01:02:37] Speaker A: Might be. Yeah, it might be. It might be. I think I know somebody that might be hiring. So I'll let you know.
[01:02:41] Speaker B: Yeah, that's cool. That's cool. Let me know. Send me that link.
[01:02:43] Speaker A: I'll keep you. Yeah, I'm a real one. I'll keep you posted. Well, we've got, I think, just one more article that we wanted to jump into. Wanted to give a quick update on that Apple one, because I know we talked about it last week. Didn't want to leave you hanging, because there was an update. But this next one is a pretty big one. GitHub fixed a maximum-severity security flaw in their Enterprise Server, and it has a maximum CVSS score of 10.0. Don't we love those. Those are the ones that, you know, I've mentioned before how, like, I'll see a score of six or seven and I'm like, that's not that bad. I can't look at a ten and say that; it couldn't possibly be any worse. But it's been fixed. So, critical vulnerability. Uh, it looks like it was discovered via the GitHub Bug Bounty program. So that's good, because I guess that means it's not like, oh, we saw this being exploited and that's how we figured out it was there. Somebody discovered it, it was reported, and then they were able to fix it. So that's kind of best case scenario, isn't it?
[01:03:33] Speaker B: Uh, yeah, yeah, we like that. That's exactly how this should go. I mean, best case scenario is that there was never a flaw to begin with. Well, yes, but that's not a real world that anybody can live in. So this is the next best thing to that: yes, there was a flaw, there was a security issue, but our researchers discovered it, we've plugged the hole, and all you got to do is a little update and you're back in action. So that's good for us. Really cool. This was a problem with basically their SAML, what's called assertions.
If you were using the encrypted assertions for your SAML authentication, then you're in trouble. This is where the trouble lay. I don't know why that happened. I don't think it goes into great detail on why that was the case. But regardless, it's neither here nor there, honestly, unless you're just very curious about that.
If you were not using the encrypted version of SAML assertions, you're fine. Or if you weren't using SAML assertions at all, you were fine.
If you were using SAML auth assertions with encryption, this is when you need to have one bead of sweat come down the old brow. You wipe it away, you update. Now, that said, if I'm remembering correctly, this article also says GitHub has issued an urgent patch for a reason. Users of their Enterprise Server software should prioritize implementing this and any other critical vulnerability patches before it's too late.
Was this the one that also said that you should basically just wipe it out, like wipe out your instance? And I don't know if it was this one, maybe it was a different one.
[01:05:21] Speaker A: Good question. I went to look at the breakdown of the vulnerability, how it'll show you the attack vector and stuff. And looking at the list, it's attack complexity low, no attack requirements, no privileges required, no user interaction and then everything else, system confidentiality, integrity, all that stuff is high, high, high. Which makes sense that it's ten. So if, if that is the case that they basically said you might want to just wipe it and start over.
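For readers who want to see how those metrics fit together, here is a rough sketch of the CVSS 4.0 vector they imply. The attack vector value (network) is an assumption on our part, since it isn't read out in the episode; the rest mirrors what's listed above.

# Rough mapping of the described metrics into a CVSS 4.0 vector string.
metrics = {
    "AV": "N",  # Attack Vector: network (assumed, not stated in the episode)
    "AC": "L",  # Attack Complexity: low
    "AT": "N",  # Attack Requirements: none
    "PR": "N",  # Privileges Required: none
    "UI": "N",  # User Interaction: none
    "VC": "H", "VI": "H", "VA": "H",  # vulnerable system confidentiality/integrity/availability: high
    "SC": "H", "SI": "H", "SA": "H",  # subsequent system impact: high
}
vector = "CVSS:4.0/" + "/".join(f"{k}:{v}" for k, v in metrics.items())
print(vector)  # this combination of worst-case values is what drives a 10.0 score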
[01:05:42] Speaker B: That wouldn't. Okay, so it says that the encrypted assertions are not enabled by default. So that means if they are on, you probably know about that, because you enabled them. Oh, so that's good to know. Which is weird. Like, wouldn't you think that the encrypted version. So the SAML assertions are basically kind of like messages that get sent back and forth between the client and the server on authentication and how that's working. Wouldn't you think you would want that to be encrypted? Like, encryption is typically a good thing, but due to a coding error, it becomes your Achilles heel. How ironic.
[01:06:19] Speaker A: Where's Alanis Morissette when we need her?
Huh, that is interesting.
[01:06:23] Speaker B: The irony is, a lot of the ironies in her song were not technically ironies.
[01:06:27] Speaker A: Yeah, that's a good point.
[01:06:28] Speaker B: Weird.
[01:06:29] Speaker A: Would that be an irony in and of itself or is that another.
[01:06:32] Speaker B: That's what I said. That is ironic.
[01:06:33] Speaker A: Yeah, okay. An actual irony or an Alanis Morissette irony?
[01:06:36] Speaker B: An actual irony.
[01:06:38] Speaker A: Oh, okay. Look at that. You're getting a literature lesson.
[01:06:41] Speaker B: My head is now spinning.
[01:06:43] Speaker A: Poor Daniel.
He's just trying to recover. He's trying to relax at home and get some stuff done and we're, we're giving him mental gymnastics to do.
[01:06:51] Speaker B: Yeah, I'm on pain pills, man. Come on.
[01:06:53] Speaker A: Oh, that's true. Yeah, I didn't think about that.
[01:06:55] Speaker B: That's.
[01:06:56] Speaker A: It's too bad. We're. This is not the time to be having Daniel do these. These dives into these articles, these mental gymnastics. Exactly. I have to look into whether this was something that they basically said, hey, you might want to just wipe and start over. But like you said, no, I don't think it does.
[01:07:11] Speaker B: It was a different article.
[01:07:12] Speaker A: Okay. Gotcha. There's just so much that happens every week. It's hard to keep track. So. But they did, like they said, issue a patch, an urgent patch. So if you are using that software, you probably want to implement that. It's been fixed in versions 3.9.15, 3.10.12, 3.11.10, and 3.12.4. So across the board, they've implemented this, they've patched this. You might want to go ahead and update, make sure that you've got that installed.
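As a quick sanity check, here is a minimal sketch of comparing an installed GitHub Enterprise Server version against the minimum patched release in its line, using the versions listed above. The example version strings at the bottom are hypothetical installs, not a statement about any real deployment.

# Sketch: is an installed GitHub Enterprise Server version at or above the fixed release for its line?
PATCHED = {
    (3, 9): (3, 9, 15),
    (3, 10): (3, 10, 12),
    (3, 11): (3, 11, 10),
    (3, 12): (3, 12, 4),
}

def is_patched(version: str) -> bool:
    # Compare numerically, component by component, within the same release line.
    parts = tuple(int(p) for p in version.split("."))
    line = parts[:2]
    if line not in PATCHED:
        return False  # unknown or older release line: treat as unpatched to be safe
    return parts >= PATCHED[line]

print(is_patched("3.11.9"))  # False: below 3.11.10 (hypothetical install)
print(is_patched("3.12.4"))  # True: at the fixed release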
[01:07:36] Speaker B: I think that the total wipe was for the JAVS one. I'm pulling up the JAVS article right now.
[01:07:41] Speaker A: Okay.
[01:07:43] Speaker B: Yeah. Although the RustDoor malware is no longer spreading via the JAVS platform, Rapid7 noted that the adversaries behind the supply chain attack are continuously updating and improving the command and control infrastructure. So you should not just delete and replace the software, but completely reimage affected endpoints.
That was for JAVS.
[01:08:02] Speaker A: Okay.
[01:08:03] Speaker B: We got on a tangent.
[01:08:04] Speaker A: Forgot to mention that. We did, as we always do. We have no trouble yapping, I feel like.
[01:08:09] Speaker B: Yeah.
[01:08:09] Speaker A: Last week's episode, I think, went close to 90 minutes.
Our board director, Christian, was having to scrub through that and edit that. So we appreciate him. We appreciate it.
[01:08:18] Speaker B: We had a lot to say.
[01:08:19] Speaker A: We had a lot to say. We like to. Yeah. As far as this episode, though, I think that's pretty much gonna do it for our articles. If there's anything that we missed that you want us to cover, again, a lot of times when these are released, there's stuff that's come out in the meantime that we just don't have enough information on to talk about yet. So if there is anything you want to see us cover in the future, leave a comment, let us know. Let us know what you liked about this episode, what you didn't like. Daniel, would you like them to?
[01:08:40] Speaker B: They'll let us know if they didn't like it.
[01:08:42] Speaker A: That's true. They will. Yeah.
[01:08:43] Speaker B: Well, you don't have to ask for that. People are just happy to.
[01:08:46] Speaker A: Yeah. I'm literally asking for it. If I get hated on, that's. It's gonna be great. It's gonna be great. I love seeing that. Daniel, would you like them to leave you maybe some well wishes regarding your arm?
[01:08:57] Speaker B: I appreciate any thoughts and prayers.
[01:08:59] Speaker A: And thoughts and prayers. Positive vibes.
[01:09:02] Speaker B: I'm doing all right, though. That's cool.
[01:09:04] Speaker A: I'm glad. I'm glad you're recovering nicely. We love having you here in the studio. It's fun having you here. So I am looking forward to, hopefully, next week.
[01:09:11] Speaker B: I'll be there.
[01:09:11] Speaker A: All right, let's go. It's going to be fun. There's also been holidays and things that have been happening and so it's been crazy. But as of next week, Daniel will be back here in the flesh with us and you'll get to see the rest of this studio. And let me tell you, we've seen it. I mean, it's pretty cool. It's pretty cool.
[01:09:27] Speaker B: It's pretty.
[01:09:28] Speaker A: It's very pretty. I think you guys are going to like it.
[01:09:29] Speaker B: Very pretty.
[01:09:30] Speaker A: If you're fans of network shows on like, ESPN and things like that. It kind of gives me that kind of vibe. So I promise it's not going to be a sports show after this. I just think the studio is.
[01:09:39] Speaker B: I got to wear, like a blazer and a tie, please.
[01:09:41] Speaker A: Yeah.
[01:09:42] Speaker B: Yeah.
[01:09:42] Speaker A: Dress like Stephen A. Smith. Or, I don't know, I don't want you to show up looking like Mike Greenberg every day. But I think that's. Yeah, he's not going to do that. That's not gonna happen. It's t-shirts all the way. You're never gonna get us to show up in suits. I think that's pretty much gonna do it for this episode, though. So, Daniel, thanks for tuning in remotely again. We're really looking forward to having you back next week. And I mean that genuinely. I know I'm usually pretty sarcastic, but genuinely, we're excited to have you back. And thank you so much for joining us for this episode of Technado. Hope you enjoyed it and we'll see you next week for more.
Thanks for watching. If you enjoyed today's show, consider subscribing so you'll never miss a new episode.