Episode Transcript
[00:00:00] Speaker A: You're listening to Protect It All, where Aaron Crow expands the conversation beyond just OT, delving into the interconnected worlds of IT and OT cybersecurity.
Get ready for essential strategies and insights.
Here's your host, Aaron Crow.
Hey everybody. Welcome to the Protect It All podcast. I'm very excited about today's episode. I've had the privilege of meeting Lesley in person and having some awesome conversations at BeerISAC and other conferences where she spoke, or at DEF CON, all these kinds of things. So Lesley, why don't you introduce yourself? Tell us who you are, what brings you here, and how you got into this crazy thing called critical infrastructure and OT and cybersecurity.
[00:00:41] Speaker B: Hey all, it's nice to meet you, if I haven't met you before. My name is Lesley Carhart. I am a technical director at a company called Dragos that does industrial cybersecurity. Personally, I've been doing incident response and forensics in the industrial and industrial-tangential sectors for over 15 years. That's what I do. I investigate power plants, trains, and manufacturing facilities that get hacked: how it happened and how to prevent it in the future.
[00:01:09] Speaker A: And it's a lot of fun stuff, right? A lot of my history and my career, you know, the hard hats behind me, has been in critical infrastructure and power plants. I was an asset owner; I supported 40-something power plants: nuclear, fossil fuel, solar, wind, all of that. And I've also worked for consulting firms and seen stuff across manufacturing and wastewater and other industries as well. But those critical infrastructures are so important to our country, to just our normal way of life. We turn on the lights and we expect it to work. But so many times the systems are old, and we don't have enough people and resources and money and time and experience, and we need to lean on people like you and Dragos. I know how to run a power plant, but I don't know what to look for when bad people are coming in and doing things. So talk a little bit about that and some of the things that you guys experience or see. And it doesn't matter if it's a big company or a small company or a wastewater utility; it's really the same across all of those types of things in this critical infrastructure.
[00:02:21] Speaker B: So I want people to take a few things away from this. First of all, I do this as a full-time job. I know that when you look at the news, you only see the really big cases that get reported; they hit regulatory reporting or they somehow make it to the news. Industrial systems are getting attacked all the time. I do this, and my team does this, as a full-time job.
[00:02:42] Speaker A: Yep.
[00:02:43] Speaker B: I divide our caseload into three general categories. The first one is commodity stuff. Now, if you're looking at industrial systems and you aren't familiar with them, if you don't work in that space, they're kind of a layer cake. We call it the Purdue model; Purdue University made a model. It's a model, it's not a framework; it's just a general description of kind of how these things are laid out. At the top level, the computers look like computers for the most part. There's been a lot of convergence between technologies. You see Cisco stuff, you see Windows stuff, doing important control stuff. Not the actual controllers, but the machines that show people the status of what's going on in the process, the HMIs, and the things that are used to program the lower-level devices; all those are starting to look like typical computers. And they're vulnerable to malware, they're vulnerable to ransomware, and it's a big deal to lose those. It's not necessarily going to make something catch on fire or explode, but it means you can't see what's going on in your systems anymore. And you might have to shut them down, because it's not safe to run things that can kill somebody when you can't see if they're running safely. So there's the commodity stuff that's going on. Ransomware is hitting those environments more and more, because like Aaron said, this is older stuff for a reason: they have long life cycles, and they require rigorous testing to put into production with life and safety implications.
And they tend to be less secure from a security design and security tooling perspective as well. You don't see much EDR, for the same reasons: rigorous testing, long life cycles. Not modern antivirus, not modern firewalls in a lot of cases. So they're very vulnerable, and they're doing important stuff. So those getting ransomed, getting hit by other commodity malware and worms, it happens all the time and it's a really big deal. Now, at the lower levels of that Purdue model, you've got industrial controllers, PLCs, RTUs, things like that, that aren't running familiar operating systems or protocols, and those aren't necessarily getting hit in those attacks. Most of the malware doesn't need to touch that stuff. They don't need to go to the effort, expend the resources yet, in a lot of these cases. But again, losing those higher-level Windows systems is a big deal. Really big deal. Showstopper for a lot of industrial environments.
And the second category that we see is insider stuff. There's a bad relationship overall in a lot of environments between the IT people and the OT people, because of years of miscommunications and poor cybersecurity practices. We have to adapt cybersecurity to critical life-and-safety process environments, and we haven't been doing that. So there's a ton of shadow IT and evading of security controls going on in these environments. A lot of times they haven't been touched by security in a long time. So people will do things like connect modems, connect dual-homed systems, connect things to Wi-Fi, add in remote access because they want to access things from home or they needed to during the pandemic. And there's not a lot of awareness in a lot of cybersecurity teams of all of these potential vectors for things to get into the environment or be tampered with. So that's a big deal too. Most of those are unintentional insiders. But we respond to horrible intentional insider cases too, where somebody did something physical, hurt something or hurt somebody, and we have to investigate who did it and what they did. And the final category: state adversaries, or state-sponsored, state-style adversaries. And what we're talking about there is mostly sabotage preparation and espionage. In terms of espionage in manufacturing, it's a lot of stealing corporate secrets, manufacturing secrets, things like that.
In terms of sabotage, it's mostly pre-positioning for future sabotage. It's stealing enough data about these complicated, safety-controlled industrial environments over years that if there is a geopolitical cause in the future, there is the potential to do something malicious. Industrial environments are complicated. Again, that layer cake. There's a lot of stuff going on, and adversaries need to build these databases and access into these environments if they want to do something quickly in the future. And they are absolutely doing that.
[00:07:01] Speaker A: Yeah, yeah. And I see that so often. You hit on the insider, right? And that's one that really hits home for me. These engineers are doing what they think is right and what they have to do to support their environment. Whether that's secure remote access, or, you know, they got a Netgear switch because it was 3:00 in the morning on a Saturday and their network switch failed and they can't get ahold of anybody because they're not supported during the weekend. So they go and they make it happen.
And they're not intentionally doing malicious activities, but they also don't know: I just bypassed the firewall and put my OT environment directly on the Internet. They didn't mean to, but they did. And that brings in all these other cascading, dominoing issues. Now it shows up on Shodan, and now I've got attackers, you know, impacting things, and they're behind the curtain now, unbeknownst to me, until somebody finds it, or there's negative impact, or there's an assessment and somebody says, why is that plugged in over there? That's new; that changed.
[00:08:03] Speaker B: Shadow IT doesn't come out of nowhere. Shadow IT, especially in industrial, is usually a byproduct of poor cybersecurity. Bottom line, end of story. If you have industrial stuff in your organization, and you probably do, even if it's just building automation and control stuff like your heating, cooling, data center, backups for your power, things like that; if you've got industrial stuff, you've probably been doing cybersecurity there wrong for a long time. Everybody has been. People are just, over the last five or so years, starting to rectify that. And people did shadow IT because cybersecurity told them to do things they could not do. They couldn't upgrade systems because their vendor would put their system out of warranty, or it would kill somebody.
[00:08:45] Speaker A: Right.
[00:08:45] Speaker B: And they needed remote access for a variety of reasons, especially during the pandemic. So shadow IT happens probably 90% of the time because people don't have a good dialogue with cybersecurity. They don't have good communication, they don't trust cybersecurity, and there is something that is preventing them from doing their work in a logical way. That's where that comes from, especially from OT. So when I'm talking about insiders, again, the vast majority of those are unintentional insiders doing something that majorly impacts cybersecurity with all these new threats. But they did it for a reason, and they probably did it because we don't have a good relationship with them.
[00:09:23] Speaker A: Yeah. In my experience, as a vendor, as a consultant, but also as an asset owner, that was the problem I was having. As an OT owner reaching across the desk to my IT brethren, they'd be like, well, you have to do this, and you have to go through my process, and it takes six weeks. And I'm just like, I don't have that time and I can't do that. And I can't use that model of switch or that model of server; it doesn't fit my use case. Well, that's the only way we can do it. I'm like, okay, so I'm going to go around you because you won't let me go through you.
[00:09:56] Speaker B: Yeah. Once more for the people in the back who are just at the beginning of this journey: yes, we need to do cybersecurity in these environments. We have to do it differently, because again, let's talk about these systems. Oftentimes they're bought for very long life cycles, and they are tested to function in a certain way. That layer cake of technologies, you're buying them all together from an OEM, and they have been tested, usually for years, to work safely together. The high-level systems, the Windows systems, the protocols, the lower-level devices, they sat in a lab for years, typically, with the OEM rigorously testing them in all different conditions to make sure they wouldn't fail and somebody would die, or the process equipment wouldn't catch on fire, get damaged, et cetera. You can't just swap out a part of that. You can't. There are maintenance windows, there are upgrades available from vendors, but those come in life cycles. They are not regular, they are not routine, they are out of band from your normal patching. You have to do updates differently in those environments. You have to schedule around maintenance outages. You have to work with the OEMs and with the operators and the safety personnel. All of that is possible, but you have to do it very, very differently. And if you try to force it in an IT way, things are going to go catastrophically wrong.
[00:11:12] Speaker A: Yeah, and you know, you talked about that testing, and it goes beyond that. In an IT world, you have an old laptop, and you call it in, and they just ship you a new laptop, because the image is the same and most of your stuff is stored in the cloud. It's really simple; you swap one for the other. It doesn't really matter if it's a Dell, an HP, an IBM, a Mac. But in these OT spaces that's not the case. I may have Windows XP running, and there's a reason it's running: whatever application or process is running on it is not compatible with any other operating system. So if I try to put in Windows 11, yes, the server is new and it can be patched, but it doesn't serve the function, and the whole system will shut down. So which is more important? And that's where this risk-reward thing with OT comes in. There is a risk from a vulnerability or an attack or malware or all these types of things, but there's also a risk to the life, safety, and availability of my system. And sometimes patching the system or updating to the latest OS or latest hardware is more risky to my business than leaving it alone and mitigating it in other ways.
[00:12:24] Speaker B: Yeah, I get called in on six-year-old, ten-year-old infections in networks. They've known that they were infected with Conficker or Sality or something for 10 years, and they finally have the outage to clean it up, and it's in everything. So it's a massive cleanup effort. That kind of stuff: how do we even approach this? How do we even clean 500 industrial computers that have been infected for 10 years, or get an embedded adversary out? Somebody did risk modeling there. Oftentimes they talked to us, or they talked to another cybersecurity vendor, about who was in their network and what the potential was. They talked to the OEM about potential impact to their systems, and they made a risk decision that shutting down their systems for two weeks would cost so much it would put them out of business.
[00:13:07] Speaker A: Yep.
[00:13:08] Speaker B: As opposed to leaving it infected. Now, there are better things that we can do there. Things like partial containment, sometimes some segmentation, isolation of systems, restriction of protocols at firewalls, things like that, to slow the spread and maybe clean parts of the network. So we can make plans for that. We have to do it very strategically with the subject matter experts. Again, you can't just install EDR on everything. You can't just upgrade everything. You can't just dump every other security tool on these systems and run them if they aren't vetted by the OEM and by the safety and engineering personnel. So you have to make a really good plan there. It's doable, but you have to be cognizant of those actual consequences. Real-life things: people dying, things getting damaged, products being damaged, the environment being contaminated.
[00:13:55] Speaker A: Yeah, they're huge implications. And that's one of the bigger differences. Especially for folks that aren't in OT, and I know we're talking to a lot of them right now, you may not understand or grasp the real difference. The analogy I like to give sometimes is: hey, you're in a plane, a commercial plane, and you're flying, and the avionics system needs a patch. You're in the air. Do you want them patching that avionics system while you're in the air, or would you rather wait until the plane lands for them to try that?
[00:14:25] Speaker B: That was my first life. My first life out of high school was as an avionics technician. And that's where I learned a lot about industrial computers.
[00:14:32] Speaker A: Yep.
And it's easy when you put it that way. People are like, yeah, I don't want them to update that as I'm flying in the air.
[00:14:41] Speaker B: Or patching, updating while you're on the highway. Does that feel good? Do you like that?
[00:14:45] Speaker A: No.
[00:14:46] Speaker B: Your computer system in your car, your media display, shuts off and says, I'm patching.
[00:14:51] Speaker A: Yeah.
[00:14:51] Speaker B: When you're using your map. No, of course not. And that's just it on a larger scale, where you're talking about water going to houses, or trains running, or the incredibly complicated just-in-time logistics and manufacturing systems we use today. No, you can do it, but you've got to be cognizant of all of those risk factors.
[00:15:09] Speaker A: Yeah. Well, you know, we've seen things even this year that weren't cyber issues, but we saw train derailments and things from sensors on wheels. And there are all of these dominoing impacts. You're doing one thing because you think it's good, but you don't necessarily realize the downstream impacts of those things. And that's why things happen slower in OT. You know, one of the things I did when I first started at the power company: we had a bad weather event in Texas, and we had to roll out weather stations and emergency satellite communications at all of the facilities. So, working with vendors, I rolled out satellite dishes and satellite communications and these independent weather stations that are reading temperature and all that kind of stuff at every facility.
I did that across 48 sites in two months, because we had a deadline from Texas legislation that we had to meet within a few months. Right. So I got it done at all 40-something sites in that timeframe. And at the nuclear facility, I had to install it in the parking lot. Why is that? Because I had to make a penetration into the control room. And anytime there's a penetration into the control room, you have to have all of these studies. Why? Because control rooms are tested very rigorously around keeping radiation out, so the people inside that bubble can survive if there's a nuclear event or something happens and they have to be in there to control it. So it took me a year to get the one at the nuclear facility actually installed.
And it wasn't surprising to me, because there were six levels of documentation and engineering studies and all this different kind of stuff. But it was just one Cat 5 cable. One Cat 5 cable penetration into the control room took a little over a year, a year and a half, to get done.
[00:17:00] Speaker B: Yeah.
[00:17:01] Speaker A: And it blows people's minds when they think about that, and how long it takes to do a relatively simple thing like running a Cat 5 cable.
[00:17:08] Speaker B: It requires infinite patience and a lot of diplomacy to do this job.
[00:17:11] Speaker A: That's right.
[00:17:12] Speaker B: I want to turn it around for a second, though, based on what you just said, for the engineers and the operators out there. The thing that I want to express on that side of things is that cybersecurity is a real and growing thing. Again, I said I do this as a full-time job. What we're struggling with still is those operators and engineers who have thought about all these maintenance errors and human errors and the safety implications of those. So there are tons of safety controls in these industrial processes to keep bad things from happening. There are digital systems like safety instrumented systems, and there are physical controls, and there are human controls, because maintenance issues, equipment failures, and human failures happen. And the implications, like we've talked about, of things failing in industrial processes are really, really bad. So there are layers of safety controls to keep catastrophic things from happening. And a lot of industrial operators are in the mindset that those safety controls are pretty infallible, and they aren't. They don't think about cyber as a potential impact that can cause something to get past those safety controls.
[00:18:17] Speaker A: Right.
[00:18:18] Speaker B: These cyber incidents are happening, and they're going to continue to get worse, because people have figured out that this is a target. The media is a wonderful sphere for amplifying things to adversaries, and adversaries learn about an environment. They are engineers too. They are process engineers, usually specialists on the systems: state-style adversaries and well-resourced criminal groups who can sit there and figure out how to evade safety controls. And it's not in a lot of engineers' or operators' threat modeling right now. They aren't thinking about somebody tampering with their safety instrumented systems, then also giving them a bad reading on an HMI on purpose, and then tampering with a lower-level industrial device to cause a bad impact.
[00:19:01] Speaker A: Right.
[00:19:02] Speaker B: So we need the operators and engineers to start thinking about cyber too, as a potential root cause of things. That is a very big hurdle as well. We've got the cyber people who are thinking cyber, cyber, cyber; that's all they think, and you can't do that in OT, right? You've got the OT people who might be cognizant that there are hackers out there doing bad things, but when they think about their low-level process, they are not thinking about a malicious hacker causing the failures they see. So on that side of things, what we really need is for the engineers and the operators to start involving cyber in difficult root cause analysis. When things don't make sense, or things repeatedly fail in their environment, when they can't identify a cause for a system doing something potentially nefarious or bad, it's really important to start getting the cybersecurity personnel involved, to identify if there is something else going on from that perspective. Because they've modeled for every type of human error and equipment and device failure out there to keep people safe, which is great, that's awesome. But they haven't thought about a determined adversary purposely evading those controls. Some of the physical ones, sure, those are hard to evade. But ask good engineers how they would break things if they were a bad guy; there's always a way. So they've got to start thinking about calling in cybersecurity to be part of those investigations, to make sure that's not what's going on.
[00:20:31] Speaker A: Yeah. I mean, especially when you're talking about these nation-states, they have potentially unlimited, or at least very large, budgets. There's no reason to expect that they don't have representations of this equipment sitting in a lab that they're beating up, saying, hey, I put it in this scenario; how can I break this safety thing? How can I bypass, how can I avoid, how can I get around all of these things? Because it's not surprising: Emerson, Siemens, they all have these standard architectures, and it's done the same whether it's a nuclear facility or a manufacturing facility. Their control systems are usually deployed the same way, using the same subnet IPs, same architectures, all that kind of stuff. So if they've got Siemens, it's pretty easy to figure out Siemens. And I'm not trying to beat on Siemens; Siemens or GE or Emerson or whomever, right, they're all the same. And there's a reason for that: because they're so big and they have so many customers, it makes sense why they have a very standardized architecture that they spread across their environments. But for an adversary, that makes it really easy to understand what that looks like, recreate it, and find ways around it, y'all.
[00:21:50] Speaker B: The barrier to entry is lowering. It requires system knowledge too; to break a process, you have to understand the consequences that can occur in that process, and all the safety controls, and how it's set up. But Aaron just described these big vendors. There used to be some layer of security by obscurity, because every environment was a little bit homebrew, a little bit custom. We're seeing a lot more standardization, and we're seeing a lot more IT technologies like Windows and Cisco, et cetera, in these environments. The barrier to entry to launching attacks against these process environments is lowering. And the vendors are aware of that. Vendors are putting in more security controls; a lot of the big vendors are doing some application whitelisting now. They're doing good stuff. But again, you're talking about determined, well-resourced adversaries, not just state but also criminal (it's a trillion-dollar industry now, ransomware). Criminal adversaries are starting to have automated tools and knowledge of the systems; they're able to buy expertise on those systems. We have LLMs now that will write you industrial code. They will write you ladder logic for industrial processes and tell you how to break it. Right? The barrier to entry for launching industrial attacks has been rapidly lowering. That's why I do the job that I do. And again, we've got to try to get ahead of this on both sides. We've got to fix all the problems with IT cybersecurity coming into these spaces, and we've got to fix the culture of "it can't possibly be cyber" on the OT side of things.
[00:23:16] Speaker A: Yeah. And you know, I've talked about it, I talked about it at DEF CON last year and continue to. Cyber-informed engineering is something that comes out of Idaho National Labs, right? It's not a concept that, in my opinion, really started there, but they've really branded it, and I love it. What it means for me is that cyber is just another risk that needs to be part of the conversation when they are designing that power plant or that manufacturing environment. They need to consider cyber in their design. Like, what happens if a bad actor gets in here and turns that one to a zero? Gives you a bad reading, right? And I've been in a power plant where the plant was running fine, but the operator screens blue-screened. So what did they do? They punched the unit out, because they couldn't control it. They didn't know what was going on. So their training says: to be safe, to make sure I save equipment and life, I'm going to shut the unit down safely, and then we'll figure out what's going on.
[00:24:13] Speaker B: Yeah, I mean, that's very significant. A lot of swapping goes on, you know. If something breaks, again, the root cause analysis is, we don't assume that it's cyber; we just swap out a box, we call in the OEM to replace the part. And there's never any forensic analysis done on a lot of those devices to figure out what went wrong. So yeah, this is going to continue to get worse, of course. And I'm not trying to be fear-mongering; I'm not trying to sell people widgets here. Yeah, I work for a company that does this stuff. But it's just logical. These systems do life-critical things. They do things that are very noticeable in society, and they make a good target for extortion, they make a good target for geopolitical objectives, and people are learning how to attack them better. And we have to take that seriously. Are 97, 98, 99% of failures of industrial devices, in these countries we're concerned about securing, still going to be maintenance failures, human error, things like that? Yeah. I get sick all the time of getting, you know, pinged on the Internet whenever there's some kind of industrial accident out there: was it cyber?
Everyone assuming it was cyber. Probably not. 99% of the time, it's still going to be just, you know, a part that failed, or an operator.
[00:25:37] Speaker A: Human error.
[00:25:38] Speaker B: Yeah, yeah. But there is that small but significant percentile where it is. There are all these groups who are interested in tampering with these systems for a variety of reasons, whether it's making money or making a statement, whether it's terrorist groups or states.
And that is actually happening. And the only way we get a good idea of what they're doing, what their capabilities are, and where they are living in environments is to do cybersecurity. I mean, look at the phone networks right now: that whole quagmire of unwinding intrusion and persistence and footholds in our phone systems in the United States.
The only way to catch that stuff is to do cybersecurity. We have to do cybersecurity in these industrial environments. And we have to take the stand that in some small percentile of cases, there is a cyber cause or some kind of element to what's going on in that environment when something fails.
[00:26:43] Speaker A: Yeah. And it's almost like, in a perfect world, you would have a cyber person embedded in the operations team or whatever, and anytime something goes on, that person is investigating the cyber aspect. Could it be a bad actor? Could it be malware? Could it be ransomware? Et cetera, et cetera. Instead, and I'm going to throw out a Hail Mary and assume that this is probably what you guys get most of the time: you're getting it in the 11th hour, after they've tried everything else, and they've rebooted it, and they've replaced stuff, and it just keeps persisting and they don't know why. And they call you when they have no other options and they've tried everything else.
[00:27:28] Speaker B: I know. We get called in so late. We have a lot of retainer customers at Dragos; a lot of organizations with industrial networks keep us on retainer so that if they have an industrial incident in their OT environment, for insurance reasons and for practical reasons, they can call us in. And a lot of organizations are still scared to call us, for a multitude of reasons. There are political reasons and regulatory reasons, like not wanting a big, high-profile incident. And then also just not doing any investigation to know to call us. And then they wait, like you just described, until they've exhausted everything else. Or maybe somebody sees something weird and we get called in months later, when all the evidence has been destroyed. We need to change this culture, and that means both teams need to work together better. In all organizations that have OT stuff, the cybersecurity people need to start really being cognizant of these environments and of industrial consequences and processes. And we've got to build a better culture of involving cybersecurity in root cause analysis in OT as well.
[00:28:38] Speaker A: So without dropping names or specifics, customers, et cetera, who's somebody that's doing it well? Who's somebody that's kind of embodying the thing it's going to take to win in this space?
[00:28:52] Speaker B: Yeah, I mean, it's really interesting. I see every vertical around the world, different-size organizations. And some of the very well-resourced verticals and companies, think about the big oil and gas companies, they have big cybersecurity teams and they can afford nice security operations centers with the best tools.
[00:29:10] Speaker A: Sure.
[00:29:10] Speaker B: But money can't buy everything.
In some other cases, I see even small organizations where there's just that one cybersecurity champion for OT who's really rocking, who's really motivated and has educated themselves and is building good relationships between the teams, and that's really effective too. I've definitely seen Fortune 100 companies where the relationships between teams are a disaster, and they have incidents because nobody's talking to one another and everybody's siloed. So it's a mix. I don't think there's one thing that solves everything. But having a good champion for OT cybersecurity who seeks out the right expertise inside and outside, and then having leadership and executive buy-in for that, is a really big element. Yeah, resources help. If you ask us what industry keeps us up at night, it's water and sewage. People in the United States are not familiar with what it would be like to lose sewage systems for a month, two months. That is not something we are accustomed to thinking about.
[00:30:12] Speaker A: Right.
[00:30:13] Speaker B: How bad that would be or losing clean drinking water for an extended period of time.
[00:30:18] Speaker A: Correct.
[00:30:18] Speaker B: People in the United States, most of Europe, the UK, Australia are not accustomed to that idea.
So yeah, it's a big shift in thinking there, you know, about the potential implications.
[00:30:34] Speaker A: Yeah, for sure. But it's promising to hear, and for the audience listening: you don't have to have a huge budget to make an impact. Somebody needs to own this and kind of plant that flag and say, I'm going to be that champion, and take that ownership and build those relationships. Because it all comes down to building trust between OT and IT, and you hit it in the beginning, right? A lot of the reason we don't have that communication is because in the past they were untrustworthy, or they tried to push something down, or they failed to come through, or whatever. That doesn't make it okay. We've got to rebuild that trust. We're on the same team, we need to be fighting in the same direction.
[00:31:16] Speaker B: It's only going to get worse. We have to function as a single unit in doing this defense for these industrial systems. And you know, I didn't articulate it well enough: water utilities, if you don't know, are incredibly under resourced. Water and sewage are utilities that are usually municipal. They usually have like one IT person and nothing to do cyber security with, and that's very, very challenging for them. They have no money for tools, for resources, for planning, for personnel to do cyber security. So it's a really scary situation in a lot of the developed world right now. And the other thing there is, yeah, we've got those low resources, but we also see some really good community efforts, like WaterISAC in the United States, trying to get them the resources they need. So again, motivated people who are doing good stuff to try to fix these problems and build better relationships can make a huge dent, which gives us some hope. Because again, water and sewage are the ones that keep us up, because they have nothing, they have no money, and they're doing this vital thing for society.
[00:32:21] Speaker A: Well, and again, you know, coming from critical infrastructure, I've worked at the largest power utilities in the country and I've worked for small ones too, and everywhere in between. And you hit on something really important earlier: it doesn't matter that you're the largest and you have a big budget. That does not necessarily mean you're winning.
And sometimes it can add more difficulties because of layers of management or whatever red tape, which makes it more difficult than some of the smaller municipalities with one guy or gal. They are the one that can make a big impact, instead of having to have a committee of people sign off.
[00:32:58] Speaker B: But it does help to have money for like antivirus and things.
[00:33:01] Speaker A: Exactly.
[00:33:01] Speaker B: It does help to have a detection tool. Or to have.
[00:33:05] Speaker A: Sure thing.
[00:33:06] Speaker B: Somebody who can monitor your systems full time.
Those things cost money. So it's a mix. Again, there's not a single answer. But caring about this problem and thinking about it strategically, with involvement from both sides of this issue, is a huge element, and that's somewhere to start for a lot of organizations.
[00:33:26] Speaker A: So speak to that person. Speak to that person that is the OT person, or maybe they want to be, and they're trying to make a difference and they don't know where to start. Maybe they don't have a budget, maybe they don't even have authority, but they're interested, and they know they have problems and they want to help. How can they best start moving that rock uphill?
[00:33:48] Speaker B: So from an organizational perspective, it's important to have an OT cyber security champion. Maybe they're one person in your SOC. Maybe it's somebody you hire to be in that role part time or full time, or maybe it's half of somebody's job. But it's important to have somebody who owns that relationship. And then that person needs to do a lot of shadowing of the OT team. They need to understand the process, not as a chemical engineer or electrical engineer, but understand it well at a high level: understand what can go wrong, the consequences, how the system is laid out, how people do their jobs. And then they also need to be a cybersecurity person. They need to be able to associate with and get expertise from other cybersecurity people and call in vendors when they need to, whether that's industrial vendors or cybersecurity companies like us, to figure out what's going on in their environment and how to best secure it and respond if there's an incident. So having that champion is really good, whether for a facility or for a holistic organization. It depends on how you lay out your environments and how different they are. But it's something somebody should own and really care about, and they should learn both sides of this problem as much as they can at a high level, so that they can pull in the right expertise to solve these problems.
[00:35:03] Speaker A: Yeah, that is so true. And the other thing that comes up for me in that is the language difference between OT and IT. Even though we're talking firewalls, and the tech is very similar, especially now in many of these spaces. You said big-I Incident, right? And there's a reason why a power company is hesitant to call something an official Incident, because that means certain things. There's a timer that starts, and there are all of these requirements that go into when they know it's an incident.
And so when I've done tabletops with people that are non-OT, they're like, well, you should just call the incident response team and declare it an incident. And it's like, yeah, time out. I don't want to declare an incident yet, because I don't know that it's an incident. As soon as I declare an incident, then there are all of these things that happen from a regulatory perspective, and those can be big impacts, especially if it's not true. So those are big things around language and understanding and trust and all that kind of stuff too.
[00:36:04] Speaker B: I have a very nuanced view on industrial cybersecurity regulation around the world for a couple of reasons. First of all, sometimes it deters investigations and monitoring because nobody wants to detect anything.
They will do the bare minimum required in monitoring so that they don't have a big-I Incident they have to report to either shareholders or the government.
And you know, the other thing is requiring people to do things that they don't have resources for. I talked about those municipal utilities like water and sewage, the ones that have no resources. If you regulate that they have to do more cyber stuff and you don't give them any more resources or tools, that's going to take away from something else they're doing.
[00:36:45] Speaker A: Yep.
[00:36:45] Speaker B: It's probably not going to be process stuff. It's probably going to be patching, some IT thing they're going to stop doing to do your cyber security mumbo jumbo regulatory stuff. Regulation has a place. Yes, we really want people to report incidents to the government so we know what state actors are doing to our critical infrastructure, so we don't get into the situation we're in with the United States phone systems right now. That's a big deal. We don't want to end up there 10 years down the line. We need to know what these adversaries are doing and where they have footholds. But we really need to be cautious. We want to get that reporting. We want to make sure we incentivize people to do cybersecurity. But we have to balance that with scaring them away from even doing detection, and with taking away the resources they use for other critical things, like updating their systems and building new architecture. That means regulation needs to come with resources, both educational resources and money, and people to help these organizations get from where they are to where they need to be. And regulations need to be written really well, so that reporting is still possible, detection is incentivized, and people are not disincentivized from doing basic cyber security because they're scared to have an incident.
[00:38:08] Speaker A: It is, and I saw it, you know, and just to do an analogy or an aside, I saw it in the safety culture in critical manufacturing and critical infrastructure. Right? Once we had this Safety Zero push, the organizations that I was with empowered their people, whether you're an intern or a 40-year person, to stop when unsure and to be able to challenge anyone: hey, that's not safe. Don't stand on the chair. Go get a ladder and put on a harness, and make sure you're locking out and tagging out. It's amazing, when they started doing that and empowered people and there wasn't any punishment for speaking up, how people turned things in. They said, hey, we had a near miss today because Bob was standing on his chair. I safely got Bob down. I told Bob that he shouldn't stand on his chair, that we should go get a ladder. That was a near miss. And Bob didn't get fired for it. Bob didn't get in trouble for it. In fact, it was: okay, Bob, now you know not to do that. If you do it again, yeah, there's probably going to be some retribution, but there shouldn't be this negative beating over the head or fines or anything like that all the time. Because then, to your point, it's going to stop people from wanting to raise their hand or look for problems. As long as I don't know it's there, I don't have to report it. So I'd rather just not know.
[00:39:23] Speaker B: And also that culture of: Bob, why were you standing on the chair? Can you not get to a ladder? Do we not have enough? Is there not one in your workspace you can get to quickly when you have a problem? Do we need to buy another ladder?
[00:39:33] Speaker A: Right.
[00:39:34] Speaker B: That stuff happens in cyber security too. If we detect something late, if we don't detect something until there's a root cause analysis done: what were we missing? This isn't somebody's fault. We're not firing the CISO of the organization. We're all building OT cyber security together as a planet right now. This is a fast growing problem that's been growing for a while, and it hasn't been attended to for a while. We're all doing this together. There's not a lot of us. I know, like, all the people who do my job.
There's a limited number of people around the planet right now who do what I do. So we have to change that culture. We have to incentivize people to detect, protect and report. And people who are in legislation in various countries really need to be cognizant of that being a really big problem right now.
[00:40:22] Speaker A: So you hit on something really important in there as well, and it's the resource thing. There definitely are not very many people that have the skill sets and experience in OT and IT and cyber, and have the diversity of experiences. You know, again, I came from working in power plants and all that kind of stuff, and I also worked in Fortune 100 IT enterprise organizations, so I bring both of those things to the conversation. When I'm looking at those problems, I have those experiences. But there aren't many people like me and like you and like others that we know. How do we grow that and get more people involved, and more diversity of thoughts and ideas and experiences? Because when you look at the resume or the job requirements on some of these things, it's like, well, you need 20 years experience and a CISSP and blah, blah, blah. And there's like 12 of us that have those requirements, and we all are currently employed. So what are you going to do when none of us want that job?
[00:41:19] Speaker B: Yeah, I mean, that's a really interesting problem. And I do a lot of mentorship. I've actually got a Calendly on my social media where I let people sign up for mentorship sessions with me. And I get asked regularly, how do you get into OT security? It's tough, because we all have these really strange backgrounds. Like, you were in electric power; I was an aircraft mechanic. You need to combine two sets of knowledge, and let me describe the two sets of knowledge for you. One of them is cybersecurity, whatever your niche of cybersecurity is, whether it's monitoring, detection, incident response, architecture, network security engineering, things like that. You need to combine some level of skill with that, and it could be entry level skill, with some knowledge of industrial processes. The mistake I see a lot of people make is they focus on industrial protocols. They go buy a bunch of PLCs and they exploit the PLCs. Y'all, let me tell you something about PLCs. You're gonna exploit the PLC? Yes. Send them a stop command: they stop. They are very simple computers. They are very vulnerable. I usually describe these industrial environments as an M&M. The M&M model is a crispy candy outside and a gooey candy center. You can exploit those unencrypted industrial protocols; that's fine.
But you know, the reality of it is: what can you do to the process? We talked about this earlier. Processes are complicated. They're getting less so, because there's a lot more standardization. But really what matters there is what causes somebody to get hurt, or the equipment to get damaged, or the environment to get contaminated, whatever those really bad consequences are that come from those processes. That's what you're worried about. And from a cybersecurity perspective, you're not worried about exploiting a single PLC. They're usually redundant. There are usually three layers of physical and digital safety controls keeping them from doing something really bad, or shutting down the processes if they do.
A good red teamer (red teaming is another niche of cybersecurity) who's learning how to exploit these systems is thinking about: how do I make the place catch on fire? Not really, I mean, just from a tech perspective. But how would an adversary do that? How would they evade all these safety controls and the redundancy in the system and the logic in the system to make this consequence happen? It's not exploiting a single PLC. Do you need to know how to do that? Yeah, but every system's got different code, different ladder logic, different protocols, and you can learn those pretty fast. Really, what you need to focus on is that second set of knowledge: how does a process work? I don't care what process. It could be canning tuna, it could be getting an airplane off the ground, it could be electric power generation, distribution, transmission, etc. It could be how cars work. You need to understand how a process works, right? How safety controls work, what the stuff in a general process is. It's the same. There are different vendors, there are different layouts in, like, tuna canning and airplanes flying, but process environments are still kind of in the Purdue model. They still have that layer cake. A lot of the protocols are similar.
Once you understand how one process works, you can usually understand how other processes work. You don't need to be a chemical engineer. You don't need to be an electrical engineer. Focus on learning that. If you're a cyber security person who wants to get into this space, go find a process you can learn about, whatever's cool to you. Go learn how trains work. Go find a friend who works in a manufacturing facility and see if you can shadow them for a while. Find some way to learn how stuff works, and you'll be able to figure out how things can be broken. That's what my advice is there.
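[Editor's note: the Purdue "layer cake" Leslie mentions can be sketched roughly as follows. The level names follow the standard Purdue reference model, but the example components and the small `zone_of` helper are illustrative assumptions, not a description of any real site.]

```python
# Simplified sketch of the Purdue reference model "layer cake".
# Level names are standard; example components are illustrative only.
PURDUE_LEVELS = {
    0: ("Physical process", ["sensors", "actuators", "valves"]),
    1: ("Basic control", ["PLCs", "RTUs"]),
    2: ("Supervisory control", ["HMIs", "engineering workstations"]),
    3: ("Site operations", ["historians", "site servers"]),
    3.5: ("IT/OT DMZ", ["jump hosts", "patch servers"]),
    4: ("Enterprise IT", ["ERP", "email", "business systems"]),
}

def zone_of(level: float) -> str:
    """Classify a Purdue level into the coarse OT / DMZ / IT split."""
    if level <= 3:
        return "OT"
    if level == 3.5:
        return "DMZ"
    return "IT"
```

Whether you're canning tuna or generating power, the layers and the boundaries between them look broadly like this; that is why process knowledge transfers between industries.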
[00:45:05] Speaker A: No, that's great. And to your point, right, I came from critical infrastructure and power generation and transmission and distribution, all that kind of stuff. But over my career I've worked in all 17 critical infrastructure verticals. Right? And even though most of my experience is from a power plant, I can walk into critical manufacturing and I understand it, because I understand how the processes work.
[00:45:28] Speaker B: You send them a stop command, they stop. Once you get on the wire, that's going to work. Like, yes, there's no encryption on these protocols. The protocols vary: you might have DNP3 in your power environment in the US, you might have Modbus somewhere else. Whatever. They're simple industrial protocols that you can look at in a packet analysis tool.
They all have HMIs that share the status of the system, and they're oftentimes Windows systems. They all have engineering workstations to program stuff. They all have some PLCs and RTUs at the lower level, and sensors and actuators.
A process is a process. Learn how process works. Listen to us, listen to me and Aaron: you can go learn this stuff. But don't focus on, oh my God, I'm going to hack a PLC. Like, oh, I found a PLC or an HMI or something exposed on Shodan. Yeah, you can hack it. But for hitting the buttons on the HMIs, there should be levels of safety controls. Yeah, you found a water HMI on Shodan. Okay, there's a lot that's exposed, and that sucks, and that's a bad thing for cybersecurity. But most of them have layers of controls that keep you from, like, changing the chlorine levels too drastically. There's other stuff there. It's a whole system. It's not just that one thing that you have access to. So that's an element of understanding how to do cybersecurity here.
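[Editor's note: to make the "no encryption, send a stop command" point concrete, here is a minimal sketch of a raw Modbus/TCP "Write Single Coil" request built with only the Python standard library. The field layout follows the published Modbus Application Protocol specification; the transaction ID, unit ID, and coil address are made up, and this is strictly a lab-bench illustration for equipment you own.]

```python
import struct

def modbus_write_coil(transaction_id: int, unit_id: int,
                      coil_address: int, on: bool) -> bytes:
    """Build a raw Modbus/TCP 'Write Single Coil' (function 0x05) request.

    Note there is no authentication or encryption anywhere in this frame;
    that is exactly the point about these simple industrial protocols.
    """
    value = 0xFF00 if on else 0x0000          # Modbus encodes ON as 0xFF00, OFF as 0x0000
    pdu = struct.pack(">BHH", 0x05, coil_address, value)
    # MBAP header: transaction id, protocol id (always 0), remaining byte
    # count (unit id + PDU), then the unit (slave) id.
    mbap = struct.pack(">HHHB", transaction_id, 0x0000, len(pdu) + 1, unit_id)
    return mbap + pdu

# 12 bytes total: 7-byte MBAP header + 5-byte PDU
frame = modbus_write_coil(transaction_id=1, unit_id=1, coil_address=10, on=False)
```

Twelve plaintext bytes on TCP port 502 is the entire "exploit", which is why the interesting question is never the protocol itself but what the process and its safety layers do with that command.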
[00:46:49] Speaker A: Yeah. And if you're going to set up that lab, you know, there's nothing wrong with having some fun and hacking a PLC, but more so, understand how to make it work the way that it should work, in a proper way, and how it can control things.
[00:47:01] Speaker B: Go hack.
[00:47:02] Speaker A: Yeah.
[00:47:02] Speaker B: Your PLC. Yeah. Take one apart. Sure. I got some PLCs on my desk. They're fun to play with.
[00:47:07] Speaker A: Yeah.
[00:47:07] Speaker B: Build yourself something at home with ladder logic. You know, make something. Brew your coffee or something. Sure, that's great. But really, you need to understand how you brew the coffee.
[00:47:16] Speaker A: Right.
[00:47:16] Speaker B: How does the coffee maker fit into this?
[00:47:18] Speaker A: Right.
[00:47:19] Speaker B: Not just the plc, it does not exist in a pile on your desk. And that's it. It's connected to things.
[00:47:25] Speaker A: Well, especially in these critical places. Right? We have backup and tertiary systems. I mean, again, I supported a nuclear power plant, and we had analog things. There was analog, and then there was digital, and then there was tertiary digital. So even if I hit one, there were two other systems that protected it. And there were thresholds: even if I could say, ramp chemicals up or down, it could only do it to this amount. And then the other ones would say, yeah, you can't do it that fast, you're out of here; or, there's something outside of normal, I'm failing to analog and I'm going to send an operator out to physically move a valve, for instance.
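[Editor's note: the layered thresholds Aaron describes can be illustrated with a toy sketch. This is not real control logic from any plant; the limit values, function name, and fail-to-analog message are all invented for illustration. The point is that a "valid" command still has to pass independent absolute and rate-of-change checks.]

```python
# Toy illustration of layered setpoint protection: an absolute limit check
# plus an independent rate-of-change check, with fail-safe behavior.
MAX_SETPOINT = 100.0      # hypothetical absolute ceiling
MAX_STEP = 5.0            # hypothetical maximum change per command

def apply_setpoint(current: float, requested: float) -> tuple[float, str]:
    # Layer 1: absolute limits; anything outside trips a fail-safe.
    if not (0.0 <= requested <= MAX_SETPOINT):
        return current, "REJECTED: out of absolute limits, failing to analog"
    # Layer 2: rate limit; even an in-range value can't move too fast.
    if abs(requested - current) > MAX_STEP:
        clamped = current + MAX_STEP * (1 if requested > current else -1)
        return clamped, "CLAMPED: rate limit, operator notified"
    return requested, "OK"
```

An attacker who owns the HMI can request anything they like; the digital limits, and behind them the analog and human layers, are what decide whether the process actually moves.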
[00:48:02] Speaker B: And I'm going to use a word I haven't used in like 10 years of these podcasts. I'm gonna go back to Stuxnet. Yeah, I can't believe I'm gonna do it. We were talking about nuclear stuff: uranium enrichment there. Look how much went into figuring out how to break a nuclear process there.
Uranium enrichment. Like, people had to figure out a very out-of-the-box solution to screw that up.
[00:48:29] Speaker A: Yep.
[00:48:29] Speaker B: It involved changing the spinning of centrifuges in the enrichment process. The holistic process of creating nuclear power and creating nuclear weapons involves enrichment. That's one aspect, and that is its own process. And somebody had to have the engineering expertise to look at that segment of a process and then say, hey, with all these safety controls in place, with all these human controls, with these incredible levels of redundancy that you're talking about, what can we do that nobody's going to notice, and evade detection, with these analog controls and these physical controls and these human controls, in an air gapped environment? That's an extreme example. Nuclear is incredibly secure for a reason, because really bad things happen if it goes wrong. But, you know, a process is a process. They are complicated. They have layers of safety controls. And these adversaries who are thinking about attacking them, they're looking for low barriers to entry, but then they have to think about the process, and that's the hard thing.
[00:49:31] Speaker A: Yeah, absolutely. Man, this has been eye-opening for people. Again, I've actually seen Stuxnet running in the wild in a power plant, but the plant didn't have Siemens S7, so it really wasn't dangerous. Right?
[00:49:46] Speaker B: Or the right centrifuges.
[00:49:47] Speaker A: Exactly. So it's there, and it's like, oh my God, we have Stuxnet. Yeah, but it doesn't really do anything. So yes, we'll clean it, but I'm not going to shut the plant down because of it.
[00:49:58] Speaker B: It's amusing that they got that in there. That's awesome. No, it's not awesome, but wild. Yeah. But that's a very old case. We don't talk about it a lot anymore because it was so long ago, almost 20 years ago now.
[00:50:12] Speaker A: Right.
[00:50:12] Speaker B: But that's the extreme example. Again, I do this every day, and people are attacking this stuff. Normally it doesn't take that level of effort.
[00:50:21] Speaker A: Yeah.
[00:50:22] Speaker B: Most environments are not that secure. They have Windows XP systems that are exposed to the Internet. They are not hard to get into. You're still dealing with those fundamentals in most environments that aren't defense or nuclear. So there's a much lower barrier to entry, and a much lower barrier to doing bad things to the process.
[00:50:39] Speaker A: Yeah, and that's going to be kind of my theme here. Usually when I come in and do an assessment or whatever, you know, it's not fancy. Many times it's like, hey, where would I start? The foundations. Do the basics. Do you have an asset inventory? Do you understand where all your assets are? Do you have a network diagram? Look at your firewall rules, look at your routing, make sure you don't have something directly connected to the Internet. These basic things are not super expensive. I don't have to hire, you know, EY or Big Four consulting to do some of these things. These are things that I can do in my downtime, and they raise the level of cybersecurity in my environments exponentially without having to buy a single product.
[00:51:25] Speaker B: I would challenge everybody who's listening to this, and thank you for listening, especially if you've gotten through an hour with us, I really appreciate it and hope you've learned something. If you are a cybersecurity professional, try to identify where industrial devices exist in your workspace, in your facilities. Even if you're not a manufacturing company, you probably have data center industrial controls that do very important stuff for your operations. Identify where those industrial systems are and whether anybody's doing security on them. And then start finding a champion and building some kind of fundamental program. Figure out what's there, like the asset inventory Aaron just described. Don't scan things intrusively; use, you know, passive discovery.
Try to figure out what's out there, try to find resources and help to understand what's going on in that process environment, and just start from the fundamentals. Anything's better than nothing. I implore you all. Again, I see people on their worst day ever, all the time. Start somewhere. Start with the basics. There are low tier adversaries that you can catch just by doing the fundamentals in these environments.
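[Editor's note: a toy sketch of the passive-discovery idea mentioned above: classify already-captured network flows by well-known industrial ports instead of actively scanning fragile devices. The port-to-protocol mapping uses IANA-registered defaults; the IP addresses and the `inventory` helper are invented for illustration, and real passive-monitoring tools do deep packet inspection rather than simple port matching.]

```python
# Illustrative passive asset discovery: flag hosts that speak on
# well-known ICS ports, using flow records from an existing capture.
ICS_PORTS = {
    502: "Modbus/TCP",
    20000: "DNP3",
    102: "Siemens S7comm (ISO-TSAP)",
    44818: "EtherNet/IP",
    4840: "OPC UA",
}

def inventory(flows):
    """flows: iterable of (src_ip, dst_ip, dst_port) tuples from passive capture."""
    assets = {}
    for src, dst, port in flows:
        proto = ICS_PORTS.get(port)
        if proto:
            assets.setdefault(dst, set()).add(proto)
    return assets

observed = [
    ("10.0.1.5", "10.0.2.10", 502),
    ("10.0.1.5", "10.0.2.10", 44818),
    ("10.0.1.6", "10.0.2.11", 20000),
    ("10.0.1.7", "10.0.3.9", 443),   # ordinary HTTPS, not flagged here
]
```

Even this crude port matching, run against a span-port capture, starts answering "where are my industrial devices and who talks to them" without ever touching a fragile controller.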
[00:52:32] Speaker A: And it's amazing how many of these vulnerabilities or risks kind of go away when you do those basics. They don't take much once you start thinking about them and looking at these issues.
[00:52:45] Speaker B: Deterrence is a real thing. Yeah. A lot of these criminal actors, if you are too hard of a target and it's not worth the monetary effort to break into you, they'll move on to another, easier target. It's not a be-all, end-all. I hear that crap like, oh, the adversary only has to succeed once and defenders have to succeed all the time. No, it's a layer of defense in depth. You are deterring more adversaries. The more you do, the more stuff you'll catch, and the more people will find you too time consuming and money consuming to intrude into, and they'll move on to somebody else. Right? This is a doable thing, but you have to start somewhere in your industrial space, and you have to do it a little bit differently.
[00:53:25] Speaker A: Yeah. Awesome. So with all of that, I'm going to give you the final wrap-up question that I always give everybody. In the next five to 10 years, what's one thing that you see coming up over the horizon in cyber that's maybe concerning, and one thing that's exciting, from your perspective?
[00:53:42] Speaker B: Concerning these attacks are only going to continue to get worse because people who are doing bad stuff see them in the news and see if they're successful. Like you want to cause an impact, you want to get, you want to extort people. Bringing down critical infrastructure, manufacturing, logistics is a great way to do that and it's incredibly powerful tool for states as part of larger geopolitical and conventional warfare campaigns.
[00:54:03] Speaker A: Sure.
[00:54:04] Speaker B: When it's the most cost effective thing to do, which sometimes it is, sometimes it isn't. So that's only going to continue to get worse. With these problems of convergence of technology and standardization of systems, we're starting to see commodity toolkits and malware that have the capability to hit industrial systems. That definitely happens. People are starting to build automated toolkits to attack standardized industrial configurations, and that's really bad.
But on the positive side, I'm getting a lot more retainer customers. I'm getting a lot more people calling us for architectural advisory services, fundamental incident response plan construction, things like that. And I'm sure that our competitors out there in the industrial space are too. That's good; people are starting to care. We're getting called in a little bit earlier in some organizations to be part of root cause analysis. That's good. People are starting to have us take a look at long term infections that have been there for 10 years. That's good. So defense is growing, attacks are growing. They're both happening. It's two sides of the same coin, unfortunately. But I'm glad that OEMs and industrial operators and owners are starting to care a little bit more about cyber security, even while the adversaries are ramping up targeting of those systems.
[00:55:23] Speaker A: Yeah, I think it's the same canary in the coal mine. That's what's, you know, getting more budget and more people on the defensive side from these companies. Obviously the adversaries see the same thing, so they see that it's working, so they're going to do more. And then it's just this, you know, snake eating its own tail.
[00:55:45] Speaker B: We're both going to be employed for the rest of our careers.
[00:55:47] Speaker A: Exactly.
[00:55:49] Speaker B: I'd like to go become a bartender or teach karate for the rest of my life, but that's not gonna happen.
[00:55:53] Speaker A: That's right. That's right.
[00:55:55] Speaker B: We'll be very, very busy for a long time. But things can get better.
[00:56:00] Speaker A: Yeah, it can. And you know, over my career I've seen a drastic improvement in this in all spaces. Does that mean there's not a lot of work to do? To your point, there's a ton of work to do, and we're way behind the eight ball, and that's not where I want to be. But we're here, and I've seen a lot of really great people and a lot of really great conversations and a lot of great work being done in this space. And I want to empower people, by using this platform and by putting people like you on it and having these conversations, so that people don't get frustrated. So they don't just feel like, I can't do anything, it's too big of a problem. Because it's not. It is a big problem, but you can be the ripple that makes an impact in your organization.
[00:56:41] Speaker B: Agreed.
[00:56:43] Speaker A: So what's your call to action? Where are you going to be? I know you've got all sorts of things going on in your world, so what do you want people to know, or reach out to you with? You already mentioned your Calendly link, so that's really cool for people to take advantage of.
[00:56:54] Speaker B: Yeah, I'm hacks4pancakes on all social media minus X. So Mastodon, Bluesky. I'm on LinkedIn, I'm on Instagram, all that stuff.
Or Threads. Feel free to reach out to me through any of those. I also have my Calendly link if you want to get mentorship from me. I do mentorship clinics around the US conference circuit currently, but the big news for me is I am moving to Australia within the next five or six months. I'm going to be based in Melbourne, and I'm excited to be part of the conference scene and mentorship scene there when I arrive.
[00:57:29] Speaker A: That's awesome. Hopefully I'll get to come down there and see you at one of those conferences instead of just the US-based ones that we've spent so much time at on this shore. So, well, thank you for your time today. I really appreciate it. It was an awesome conversation, and I think it was super valuable for folks. So definitely reach out.
Can't recommend it enough to reach out and ask questions, and if you're looking for mentorship, Leslie would be a great one to reach out to. As you heard in this conversation, there's a lot of valuable knowledge and insight there that she's happy to share, as all of my guests seem to be. Which is why I love doing this. It's not hard when it's fun.
[00:58:10] Speaker B: Thank you so much for having me. It was so much fun.
[00:58:12] Speaker A: Absolutely. Thank you for your time. Thanks for joining us on Protect It All, where we explore the crossroads of IT and OT cybersecurity.
Remember to subscribe wherever you get your podcasts to stay ahead in this ever-evolving field. Until next time.