“In a perfect world, you’d have secure by design. Whatever we developed would have security integrated. From day one, it would be secure. But insecure by design is much worse than a lack of secure by design. What it means is everything that an attacker would want is a documented feature and function of the system. So, an attacker doesn’t have to come in and find some vulnerability to exploit and find a way to cause this damage. Essentially, if they have the engineering skills and the automation skills, they can do whatever they want.”

That’s Dale Peterson, who is on the leading edge of helping security-conscious asset owners in a range of sectors effectively manage and reduce cyber risk to their industrial control systems (ICS). An ICS is a computer system that monitors or controls a physical process. They exist everywhere: power generation, water supply systems, transmission, product manufacturing.

Dale joined the NSA as a cryptanalyst, a codebreaker. After six years he can say little about, he moved to the commercial network security industry. In the past 21 years, Dale has helped secure hundreds of companies’ industrial control systems in every sector: oil/gas, electric, water, manufacturing, mining, building automation. And that’s just the beginning.

(Subscribe to Hack the Plant on Spotify or Apple Podcasts, by RSS feed, or search for it wherever you listen to podcasts.)

Transcript:

Joshua Corman:

Our dependence on connected technology is growing faster than our ability to secure it, especially in areas affecting public safety and human life.

Bryson Bort:

I’m Bryson Bort. And this is Hack the Plant. Electricity, finance, transportation, our water supply. We take these critical infrastructure systems for granted, but they’re all becoming increasingly dependent on the internet to function. Every day I ask and look for answers to the question: does our connectivity leave us more vulnerable to attacks by our enemies? I’m a senior fellow at the R Street Institute and the co-founder of the nonprofit ICS Village, educating people on critical infrastructure security with hands-on examples, not just nerd stuff. I founded GRIMM in 2013, a consultancy that works on the front lines of these problems every day for clients all over the world.

Bryson Bort:

For today’s episode, I’m joined by Dale Peterson, who is on the leading edge of helping security-conscious asset owners in a range of sectors effectively manage and reduce cyber risk to their industrial control systems (ICS). An ICS is a computer system that monitors or controls a physical process. They exist everywhere: power generation, water supply systems, transmission, product manufacturing.

Dale Peterson:

What’s still not really widely understood is that these systems, for the most part, are insecure by design… for a long time, there were systems that were put in and then not touched for decades.

Bryson Bort:

We’re here today to talk about some of the key cyber vulnerabilities in these systems.

Dale Peterson:

In a perfect world, you’d have secure by design. Whatever we developed would have security integrated. From day one, it would be secure. But insecure by design is much worse than a lack of secure by design. What it means is everything that an attacker would want is a documented feature and function of the system. So, an attacker doesn’t have to come in and find some vulnerability to exploit and find a way to cause this damage. Essentially, if they have the engineering skills and the automation skills, they can do whatever they want.

Bryson Bort:

And to talk about Dale’s work to modernize ICS security through the S4 Conference and community – bringing together asset owners and experts.

Dale Peterson:

This industry is very conservative, very slow moving, very resistant to change. One of the things that I’m trying to do, probably my biggest thing right now, is to shake that loose… Anything that could change that predictable operation is considered a threat, is considered something that, unless there’s some big benefit associated with it, they just don’t want to touch… As this little cloistered operations group is being integrated in with the larger technology part of the organization, that’s changing, not as fast as a lot of us would like, but it is changing.

Bryson Bort:

We also discuss the relationship between the government and the private sector, how CEOs and other decision makers should evaluate and deploy resources to deal with ICS cyber threats, and the importance of regulators developing metrics for improving ICS cybersecurity.

Dale joined the NSA as a cryptanalyst, a codebreaker. After six years he can say little about, he moved to the commercial network security industry.

In the past 21 years, Dale has helped secure hundreds of companies’ industrial control systems (ICS) in every sector: oil/gas, electric, water, manufacturing, mining, building automation.

In 2007, Dale created the S4 Conference because there was no event where ICS security research could be presented to an audience that understood it.

Bryson Bort:

All right, Dale. Well, welcome to Hack the Plant Podcast. To start off, if you could tell us a little bit about your background and what you’re working on now.

Dale Peterson:

Well, I do two main things now. I help asset owners in a variety of sectors figure out what they should do next to reduce and manage their cyber risk for their industrial control systems. The other half of my time I spend on, I guess you could call it, content creation. I have the S4 Conference, a podcast, articles, speaking. The tagline we use is to create the future of this industry.

Dale Peterson:

This industry is very conservative, very slow moving, very resistant to change. One of the things that I’m trying to do, probably my biggest thing right now, is to shake that loose. Occasionally, I’ll have an idea that will shake it loose, but what I really care more about is providing a venue and a voice to the people that are trying to drive change in the industry.

My background is strange, because I stumbled into security and I stumbled into SCADA, or industrial control system, security. It wasn’t some great plan. I tend to gravitate to what I’m most interested in. I graduated with a degree in finance and was looking at a career as an actuary for a while. I wasn’t miserable, but it wasn’t really terribly exciting. So, you’ll appreciate this.

I read this article about Vernon Walters and thought maybe I’d take the test to try to become a diplomat, because I heard there was a really cool written test and a role-playing test. I was a week too late to sign up for that. So, the placement office said, “Hey, there’s this test from the NSA,” which, back in 1984, hardly anyone knew about. It was [inaudible 00:01:17] agency back then. So I said, “Well, why not?” and took the test. It ended up that I had some skills in the codebreaking, cryptanalysis area. So, I went to work at NSA that way.

Dale Peterson:

At the time, I remember my dad saying, “There’s no future in that. You can work as a cryptanalyst at NSA, but there’s no future for security. What are you going to do with the rest of your life?” But fortunately for me, by the time I left NSA in the ’90s, there was a small but growing security area, primarily in finance and banking and a little bit in the military. And then every decade since then, it’s just grown.

On the SCADA side, where I got into industrial control systems, these systems that run physical things, it was basically a product we were developing in the company. I had started Digital Bond and it was floundering. We needed to make some money, so we did consulting. This water company out west said, “Hey, can you guys do a cybersecurity assessment on our water SCADA system?” Of course, consultants that are looking for money always answer that question with yes. So, I went out there. We did it. We were actually a tiny little company. For some reason, they selected us over IBM. I was just amazed.

Dale Peterson:

I encourage anyone who hasn’t visited one of these sites that’s monitored and controlled by a control system to go out there and look at it, because it’s just so cool to go out. In this case, seeing the pumping plant and dam check gates, or going to see a factory. I was just fascinated by that. I really liked working out there as opposed to working in the data center. So, I just found myself gravitating more and more to that. By 2002, doing control system security was all I did. That’s what I’ve been really focused on for the last 20 years. So, that’s how I got into it.

Bryson Bort:

When you were at the NSA, why did your father discourage you by saying security had no future?

Dale Peterson:

Well, think about this: this was 1984. So, this was a long time ago. For example, if you were a cybersecurity professional in private industry, there just weren’t jobs. I mean, there was literally no market unless you were, let’s say, going to work for a contractor who had classified contracts to do that type of work for a government agency. Other than that, there were very few jobs. Unless you could predict the future, it didn’t even look like there would be jobs. So, he was actually 100% right in his analysis at the time and his guidance. I’m just glad he wasn’t too forceful, because he allowed me to make the decision. But he clearly told me he thought it was a mistake. It probably was, looking at the facts at the time.

Bryson Bort:

So, you mentioned what got you into this was that consulting opportunity to conduct a risk assessment on a water plant. Can you define in your own words what an industrial control system is, and some of the different applications that we see in critical infrastructure?

Dale Peterson:

Oh, sure. It’s really pretty simple. An industrial control system is a computer system that is monitoring and controlling a physical process. So, in that first case, it was a system that moved water from large, dammed reservoirs down to various cities that would then treat the water and deliver it to their customers. Obviously, electric is the one everyone thinks about: power generation plants, transmission, distribution, things of that nature. But it’s really everywhere.

I think some of the fun ones to see are actually in the manufacturing realm. So, we’ve done work in chocolate factories. You actually see a control system run by maybe two or three people in the plant. It’s fascinating to see the automation. It really is impressive. They’ll take the raw ingredients, mix them up, heat them up, form the chocolate bars, swirl on the nougat, put on the nut, let it cool, wrap it up in its original wrapper, put maybe 12 of them in a box, put boxes in a crate, slap a label on it, put it on a pallet, and send it to shipping. Literally, that computer system, that control system, runs that entire process. You just have a couple of people watching it.
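To make that definition concrete, here is a minimal sketch, in Python with made-up tag names and thresholds, of the supervisory loop at the heart of any ICS: read a measurement from the physical process, compare it to a setpoint, and command an actuator.

```python
import time

# Hypothetical process-I/O stubs; a real system would talk to field devices
# (PLCs, RTUs) over an industrial protocol instead of returning canned values.
def read_reservoir_level_m() -> float:
    """Return the current reservoir level in meters (stub)."""
    return 4.2

def set_pump(running: bool) -> None:
    """Start or stop the transfer pump (stub)."""
    print(f"pump {'ON' if running else 'OFF'}")

LOW_SETPOINT_M = 3.0   # start pumping below this level
HIGH_SETPOINT_M = 5.0  # stop pumping above this level

def control_loop() -> None:
    """Monitor the level and control the pump: monitoring and controlling a physical process."""
    pump_running = False
    while True:
        level = read_reservoir_level_m()
        if level < LOW_SETPOINT_M and not pump_running:
            set_pump(True)
            pump_running = True
        elif level > HIGH_SETPOINT_M and pump_running:
            set_pump(False)
            pump_running = False
        time.sleep(1)  # scan cycle, much like a PLC scan
```

The chocolate line Dale describes is essentially this same loop, repeated and sequenced at much larger scale.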

So, I think one of the things that the security community gets a little bit wrong is we look at the really poor level of security in those systems. We think that is somehow reflective of the individuals that created those systems. The amount of intelligence, the amount of attention to detail to make these systems work as they do is tremendous. They just really don’t enjoy the security aspects and really haven’t viewed that as their role for quite a while.

Bryson Bort:

Well, you mentioned earlier that the industry is slow moving and resistant to change. So, one, why is that? Two, how would you sum up the current state of affairs with respect to industrial control systems and security?

Dale Peterson:

Two big questions there. On the first one, I think you can really see the resistance to change, because whenever anything new is introduced, the almost knee-jerk response by 90% plus of the community is, “It won’t work in control systems.” We heard this for Ethernet and for Windows; that was back in the ’90s. In the 2000s, antivirus could never work in a control system.

We actually had a research project in the 2006-2008 timeframe, funded by DHS and the Department of Energy, to develop control system detection signatures. So, basically, things that would listen on the network and say, “This might be an attack, raise an alert.” All it did was listen on the network. It didn’t inject anything. Everyone said, “That could never be attached to the network,” even though it didn’t do anything. But now, of course, that’s real, as is antivirus.
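A passive detection sensor of the kind described here only observes traffic and raises alerts; nothing is injected onto the network. Below is a minimal sketch, assuming the scapy library and a made-up allow-list rule that flags Modbus/TCP write commands from any host other than the expected HMI; real products use far richer protocol dissection and baselining.

```python
from scapy.all import sniff, IP, TCP, Raw  # pip install scapy

# Hypothetical allow-list: only this HMI should ever write to the PLCs.
AUTHORIZED_WRITERS = {"10.0.0.5"}
MODBUS_WRITE_CODES = {0x05, 0x06, 0x0F, 0x10}  # write coil(s) / register(s)

def inspect(pkt) -> None:
    """Passively inspect Modbus/TCP packets and alert on unexpected writes."""
    if not (pkt.haslayer(IP) and pkt.haslayer(TCP) and pkt.haslayer(Raw)):
        return
    payload = bytes(pkt[Raw].load)
    if len(payload) < 8:
        return
    function_code = payload[7]  # first PDU byte after the 7-byte MBAP header
    src = pkt[IP].src
    if function_code in MODBUS_WRITE_CODES and src not in AUTHORIZED_WRITERS:
        print(f"ALERT: Modbus write (fc=0x{function_code:02x}) from unauthorized host {src}")

# Listen only -- the sensor never sends anything toward the control system.
sniff(filter="tcp port 502", prn=inspect, store=False)
```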

More recently, virtualization was considered to be impossible. Well, I can’t say just about everything, but more than half the systems that have been deployed in recent years are virtualized. We’re hearing the same thing about cloud today, that it just won’t work, but we’ll see operators replaced by machine learning in the cloud in upcoming years. I guess the reason for it is that the systems are evaluated primarily on their ability to predictably run a process, produce a product, produce a service.

Dale Peterson:

Anything that could change that predictable operation is considered a threat, is considered something that, unless there’s some big benefit associated with it, they just don’t want to touch. There are still so many like this; for a long time, there were systems that were put in and then not touched for decades. So, that type of mentality has stuck. Fortunately, there’s a core, a small group, I don’t know if it’s 2% or 5%, that is really starting to push the industry. As this little cloistered operations group is being integrated in with the larger technology part of the organization, that’s changing, not as fast as a lot of us would like, but it is changing.

If you want me to tackle the other question: one of the things that is really unique about this, and I think it’s funny, because we’ve been preaching it since 2012 as loud as we can but it’s still not widely understood, is that these systems for the most part are insecure by design. When I say that, that’s a term I’ve tried to really drive home for the industry. I’m glad to see it’s being picked up some. Insecure by design doesn’t just mean a lack of secure by design.

So, most people can understand that in a perfect world, you’d have secure by design. Whatever we developed would have security integrated. From day one, it would be secure. But insecure by design is much worse than a lack of secure by design. What it means is everything that an attacker would want is a documented feature and function of the system. So, an attacker doesn’t have to come in and find some vulnerability to exploit and find a way to cause this damage. Essentially, if they have the engineering skills and the automation skills, they can do whatever they want.
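To see what insecure by design looks like on the wire, consider Modbus/TCP, one of the most widely used control protocols. A write command is a handful of bytes with no credential, session, or signature fields anywhere in the frame; issuing it is a documented feature, not an exploit. A minimal sketch that only builds and prints such a frame (nothing is sent, and the device and addresses are hypothetical):

```python
import struct

def modbus_write_single_coil(transaction_id: int, unit_id: int,
                             coil_address: int, turn_on: bool) -> bytes:
    """Build a Modbus/TCP 'write single coil' request (function code 0x05).

    Note what is absent: no username, no token, no signature. Anyone who can
    reach TCP port 502 on the device can issue this documented command.
    """
    value = 0xFF00 if turn_on else 0x0000
    pdu = struct.pack(">BHH", 0x05, coil_address, value)                   # function, address, value
    mbap = struct.pack(">HHHB", transaction_id, 0, len(pdu) + 1, unit_id)  # MBAP header
    return mbap + pdu

frame = modbus_write_single_coil(transaction_id=1, unit_id=1,
                                 coil_address=0, turn_on=True)
print(frame.hex())  # 12 bytes; every one of them is specified by the open protocol spec
```

That is the “soft interior” Dale describes below: once an attacker can reach the network, the protocol itself hands them the levers.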

Dale Peterson:

Maybe a great example of that is the Ukraine attack that has gotten so much coverage over the last few years, where the bad guys, the adversary, shut off the power to a bunch of distribution substations and caused a blackout in Ukraine. Well, they didn’t actually hack those substations or hack the system to do that. They just used the shutoff command. It’s the same thing with the firmware. The network that was used in that control system had some devices in it. To disable those devices, they uploaded bad firmware. They didn’t need an exploit to upload that bad firmware. They just used the upload-firmware command.

So, most of what you have today in the control system world is this hopefully hard exterior and this very soft interior, where once you gain access to the system, you can do whatever you want, limited only by your engineering and automation skills, not by your hacking skills. This is really something that a lot of people have a hard time coming to grips with, because they want to deploy all these good security practice measures that aren’t bad, aren’t wrong, but aren’t really accomplishing anything. Because if the bad guy has gotten to that point, he doesn’t need to look for those missing security controls. He just does whatever he wants to do.

Bryson Bort:

One of the common themes that we come back to repeatedly in this podcast is community. We talk about industry as a bunch of asset owners who are running all of these elements of critical infrastructure. But then there’s the informal community of experts who make up part of that 2 to 5% that you described earlier as the folks really driving the change. One of the things that you do is host an annual conference called S4, which I think contributes a lot to that discussion and to bringing those folks together. Can you tell us a little bit more about that?

Dale Peterson:

Sure. Happy to, because it’s really where my passion is. We started that conference back in 2007, because I had a researcher named Matt Franz who found a vulnerability in what’s called an ICCP stack. That’s the protocol that the various power utilities use to communicate the status of their generation and the status of the grid. So, if you think about it, this would be a protocol that essentially would allow something bad to spread out, not just in one utility, but across many utilities. So, it was a pretty serious vulnerability. We talked about, “Well, where can you present that?” You need to present this to an audience that understands control systems and understands security. At the time, it just didn’t exist. There were some of either, but nothing that understood both. So, we created that.

We also had a goal of creating a community, because I had gone through a lot of the same things in the crypto world, where there were basically just a few experts here and there that got together a couple of times a year in groups of 50 or 100. I saw that grow, through efforts by Whit Diffie and Ron Rivest and others, into this huge community. So, we had community building as a goal. We had a place to present research that people would get as a goal. And then as the event went on, I would say we really changed in the mid-2010s, maybe around 2015. We started to really focus on driving change, because awareness and community were succeeding, but we weren’t seeing the change that we wanted to.

So, what I try to do with S4 is find out who’s doing really great work, bleeding-edge work, and sometimes even work that I don’t agree with. So, we’ll oftentimes have people with two contradictory solutions on the stage. But the whole idea is to put these ideas in front of people and encourage people to think differently. I hate to say think outside the box, but think of ideas that would get the knee-jerk “that won’t work in ICS” reaction and put them in front of a group of people that might be more accepting and might run with them. We try to do it very differently. So, it’s not in a hotel ballroom. It’s actually down in Miami South Beach, with a lot of outdoor time around the pool. It’s held in a theater, not in a conference hotel ballroom. So, we try to do everything differently.

Dale Peterson:

We actually have live art being created, graffiti artists and all sorts of things going on just to put this forward-looking community in an environment where they can think of and share ideas.

Fortunately, we’ve seen quite a bit come out of it. In fact, some of the companies that are doing well in the industry actually got together first down at the event, where people sitting together said, “We should do this.” Some of them actually started companies. Some of them started research efforts. Just from the last event, for example, we’ve got an effort to create a top 20 secure PLC coding practices list. We’ve got a bunch of people trying to do this ICS4ICS effort to respond to incidents in a coordinated way. So, we’ve seen a lot of things come out of it, but I still would say we’re at the early stages of really creating this vibrant community of people that want to make change.

Bryson Bort: 

As a part of that, you also put out a periodic assessment of the ICS market, the companies, the tools, and the people that are driving these trends in security. What did we see last year and what do you expect we’ll see this year?

Dale Peterson:

Oh boy, that’s a big question.

Bryson Bort:

I keep stacking them on top, Dale. It’s a Monday, but we’ve got to push hard.

Dale Peterson:

Well, there’s what happened, what I think is going to happen, and what I’d like to see happen. So, 2020 was an interesting year in that it was another year, again, where there were very few cyber incidents in this space. So, as much ink as was spilled about all the terrible things that could happen, when you actually look at the impact, the consequences of cyber-attacks on industrial control systems, the actual impact, thankfully, was minor again this year. It has been every year since the market really started 20 years ago. So, that is the good thing.

The hard part about that is, when nothing has happened, you run into this “Oh, it’s never happened to us before” issue. So, we’re still in this high-impact, low-frequency, or you could call it long-tail or black-swan, situation, which makes it difficult to get resources. I think probably the biggest change, and I know this goes out to a lot of government people and is related to some of the things that government can do, is one we saw last year and even more so in 2019; in 2020, everyone’s attention was elsewhere, obviously, dealing with the pandemic.

What we’ve seen over the last couple of years is that this risk was hidden off in operations. Operations, the people that run the control system, said, “Nothing bad has ever happened. We’ve got it covered. Don’t worry. Leave us alone.” This risk has begun to leak out to the C-levels and the board, where now they are becoming aware of it. As soon as they become aware of it, they start asking questions, and then they realize that they have silently accepted some risk. This makes them very nervous, because one of their primary jobs is to manage risk. Here, there’s this potentially large risk out there, or certainly a large-consequence event that could happen, that they haven’t been dealing with. This then frees up all sorts of resources to address it.

Dale Peterson:

Of course, if you’re a senior-level manager, you want it solved. So, ideally, they say, “Well, how much money do I have to spend to solve this risk, to address this risk in the next three to six months?” That becomes impractical. You just can’t solve this type of thing that quickly. You can’t absorb that much change that quickly, going from something that hasn’t been touched for decades to where you want it to be in three to six months.

So, what we’ve seen a lot of, I would say in the last two years, is leadership-driven efforts to try to figure out what they should be doing to address this risk. It’s caused a lot of problems, because the knee-jerk reaction, again, I’ve used that word a couple of times, is, “Oh, we need to patch everything,” or “We need to apply these IT good security practices to OT.” It’s not that they’re wrong to do this. It’s never wrong to apply a good practice, but it’s a question of how much risk reduction you’re getting in this insecure-by-design world.

So, what we’ve actually seen is an increase in funding sometimes for products like the detection products that I cover, as you mentioned, I analyze that market quite a bit, or for applying good security practices without a lot of risk reduction. So, while the attention is great and it’s needed and it needs to grow, we’re still not where we need to be. We’re not necessarily applying those resources very effectively, which is worrisome.

Bryson Bort:

Detection is a relatively new market in ICS that started with passively listening to network traffic. It continues to expand and innovate into other areas that help companies with their security.

Bryson Bort:

I’ve always noted that there are two kinds of companies. It all comes down to the leadership: either the leadership cares about security or it doesn’t. That’s the biggest indicator, regardless of the technical stack, regardless of the quality of the people there. If leadership doesn’t prioritize it, then you never have effective security. One of the comments you made was that we saw very few cyber incidents in 2020. Can you be a little bit more specific about what you mean by cyber incidents versus other kinds of incidents that can affect critical infrastructure?

Dale Peterson:

Yes, but let me push back a little on your leadership point. I don’t want to duck that question, but I’m not sure that we should expect our leaders to care about security. This actually ties together with your question. They care about the purpose of the business, the mission of the business, generating the product and service reliably and profitably for the business and for their customers. Security is just one thing that can get in the way of that, right?

So, if the system is hacked and it causes an outage or causes equipment damage or causes low quality output or causes someone to die, which can happen, of course. That’s one of the unique things about these systems. They’re physically large systems. People die in these plants. Not typically yet from cybersecurity incidents, but from other incidents.

So, you have weather incidents. You have labor issues. You have supply chain issues, not related to security, just that you can’t get the parts. You have equipment failure for a variety of reasons, bad maintenance. These systems go down. This goes back to your question, then. When you look at what actually caused outages, what caused any of those impact categories, health and safety, environmental, financial, service to customers, any of those issues, cyber-attacks on control systems had just a tiny, minuscule impact compared to any of those other factors I mentioned.

Dale Peterson:

So, if I’m an asset owner, if I’m in leadership, I’m not sure I really have to care about security. What I want my operations people to come to me and say is, “Look, if this happens, if this attack happens, these are the consequences. We need to understand that this is possible. Is this level of risk acceptable for us?”

I’m just working on my weekly article, my first one for 2021. What I’m really hoping we see in 2021 is a shift, or at least an increased emphasis, on recovery and resilience, because we’re never going to get this possibility of attack, the likelihood of attack, down to zero. We can employ as many security controls as we want. That likelihood is already very, very small, just based on statistical data. Now, we may be getting lucky that people just aren’t trying to do it, but right now the likelihood is very small. But the consequence can be so large that the risk is still unacceptable.

So, there are some efforts out there. Idaho National Labs has a program they call CCE, Consequence-driven Cyber-informed Engineering. There are other programs, all with acronyms, like cyber PHA and such. But all these things are doing is saying, “Let’s look at the consequence side. What if the bad guys get in and they get on a system that gives them everything they would want to do on that system? How can we make it such that the consequence of that successful attack is at a level that’s acceptable to the company and the community and perhaps the nation?”

Dale Peterson:

So, I’m really hoping that that’s what management focuses on. Not so much, “Do we have this security control or that security control or this best practice?” but, at the end of the day, is the risk of a cyber-attack and the potential consequences something that we can live with as a company, and that the people we serve can live with as well?

Bryson Bort:

So, taking it up to an even bigger picture, what do you think are the top cybersecurity challenges for the country as it relates to critical infrastructure today?

Dale Peterson:

Well, I think what I said is the biggest one: I don’t think that we are really in a position to recover. Ideally, you’d like to be resilient. You’d like to be attacked and not have it cause any issue. So, we’d like to have resilience, but realistically, you’re never going to have perfect resilience. So, I think the biggest issue right now is, if something bad happens, do we have an effective, proven plan to recover, to get back whatever those key services are, either for the business or for the nation when you’re talking about things like water, power, pipelines, transportation systems, things of that nature? Can we get back up and running if the bad guys can do everything that’s possible when they get on that control system?

Bryson Bort:

Public-private considerations, what do you think the relationship between government and the private sector is in keeping us safe or helping to improve that resilience?

Dale Peterson:

Well, it’s probably come up on the podcast a lot: a lot of the critical infrastructure in the US is a little different because it’s privately owned. That means the government can’t mandate certain levels of controls unless they put regulatory requirements in place. Now, they’ve done that in the electric sector to some degree with, let’s say charitably, various levels of success. I think the amount of security and the amount of risk reduction we’ve gotten for the amount of resources applied has been very inefficient, but it has improved. Certainly, the bar has been raised. So, you could do it that way.

But what I’ve seen, actually, is that an enlightened private industry is the most beneficial, the best way to actually address this problem. As I mentioned before, as the C-levels, the board, the CEO, the COO understand the risk, they don’t want to take on high-risk events that would worry the nation, because it’s bad for their business as well. So, as they get more enlightened about that, they can address it their own way. In fact, we haven’t seen a tremendous amount of success when government has spent money to try to do a variety of things. Private industry is almost always doing it better.

Part of me thinks the government should just focus on government control systems, of which there are a lot. They have control of those. They should make sure they’re doing a good job securing them so that they’re resilient and can be recovered. I’ve had some pretty tough interviews. I interviewed Chris Krebs back in August about what’s actually been accomplished and such. I mean, we can go into areas where I don’t think they’ve done a good job. But I think what they can really do, my advice to the government, would be to do less and be louder.

Bryson Bort:

Chris Krebs most recently came to fame as the Director of CISA who advocated for election security and was fired by President Trump via tweet. Less well known is his active involvement in driving public-private partnership for ICS. Full disclosure: yours truly was brought on by him to CISA as a Strategic Advisor last year, and the agency has a formal MOU with the ICS Village nonprofit.

Dale Peterson:

Rather than trying to say, “Here are the 250 things you should be doing and should be considering,” I would target my message almost exclusively toward the C-levels and be saying, “Do you understand this? Are you addressing this risk?” It could be a message every quarter. It could be a message more frequently. You could have supporting materials for the staff. There’s nothing wrong with that, but there’s plenty of those out there.

For example, every manager, every leadership organization, should understand what their recovery time objective is and whether they can meet it. If this key part, the reason the company exists, goes away, what will they do? Do they have a plan for that? When the leadership starts asking those questions, they tend to get very nervous at the answers, because the answers are, “Oh, that won’t happen. It’s never happened before.” And then they dig in deeper and say, “But what if it does happen?” And then they get very nervous. So, getting the right message to the leadership would be important.

Another approach that I believe could be effective is what you see, for example, with the SEC, where they require certain reporting on things. The SEC is increasingly requiring reporting, which means executive awareness, on cybersecurity issues. I see some value there, but on a lot of the other things that the government is trying to do, quite frankly, it’s being done better outside. The thing that the government has, particularly DHS CISA, that others don’t is the big megaphone. So, I even said this to Chris: you need to use that big megaphone much more effectively. I don’t think that’s necessarily a ton more volume. I think it’s being more selective and louder on the message you want to send out.

Bryson Bort:

If you could wave a magic wand, not internet connected, of course, what is one thing you would change instantly?

Dale Peterson:

Boy, there are so many. One thing, because I think it’s something fundamental that’s holding us back: I would address that insecure-by-design problem. Now, we finally have some systems that actually have authentication. So, if you want to send a command, if you want to tell this factory or this power plant to do something, it can actually authenticate that, yes, this command came from someone we trust. Yes, this command hasn’t been changed in transit. Yes, this firmware was actually issued by the company. It isn’t some bad stuff like we’ve seen with SolarWinds. Although, I guess if they get into the supply chain, that doesn’t work. But we have this insecure-by-design problem where 99% of the systems out there right now, if you get access, you can do whatever you want.
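The authentication Dale is describing, verifying that a command or firmware image really came from the vendor and was not altered in transit, can be sketched with an ordinary digital signature. Here is a minimal illustration using the Python cryptography library and Ed25519; key generation, storage, and distribution are deliberately glossed over, and the firmware bytes are a placeholder:

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Vendor side: sign the firmware image with a private key kept offline.
vendor_key = Ed25519PrivateKey.generate()
firmware = b"\x7fELF...placeholder firmware image..."
signature = vendor_key.sign(firmware)

# Device side: the vendor's public key is baked into the controller.
vendor_public_key = vendor_key.public_key()

def accept_firmware(image: bytes, sig: bytes) -> bool:
    """Install only firmware whose signature verifies against the vendor key."""
    try:
        vendor_public_key.verify(sig, image)
        return True
    except InvalidSignature:
        return False

print(accept_firmware(firmware, signature))             # True: untampered image
print(accept_firmware(firmware + b"extra", signature))  # False: altered image is rejected
```

As Dale notes, a signature only proves the image came from whoever holds the signing key; if the build pipeline itself is compromised, as in SolarWinds, the attacker gets a valid signature for free.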

If I could wave my magic wand, I’d say everyone needs to replace those systems in the next three to five years. A lot of times people say that’s impossible, that’s too expensive, but the cost of those systems as a percentage of the underlying physical infrastructure they monitor and control is tiny. So, it really is more a question of will and attention than it is of possibility or resources.

Bryson Bort:

All right. So, you waved your magic wand. Now, looking into the crystal ball for a five-year prediction, what is one good thing and one bad thing that you think is going to happen?

Dale Peterson:

I think one bad thing that’s going to happen is you are going to see the criminals figure out how to make money by exploiting these systems. It’s not that they can’t do it now. It’s just that there are easier ways to make money. But one of the reasons why the consequences have been so low is that, beyond those working for nation states who are willing to take the risk of getting caught and reprisals, the criminal element is not doing this for money. Sometimes ransomware will get onto a control system just because it’s not segmented, not separated properly. But right now, there’s been very little criminal attention paid to control systems. I just don’t see that lasting for another five years. So, that would be the bad thing that’s going to happen.

The positive thing that’s going to happen is what you hear in this buzzword, I’m not a big fan of it, IT/OT convergence. So, IT is your enterprise network; OT, your operations technology, is the systems running the controls in the plant or whatever you’re monitoring and controlling. There is an integration of these happening, because they’re using the same types of technology. That’s starting now, and it’s going to be a huge plus. Because once they combine, then some of these things that we’ve carried along with us, like not touching a system for 20 years, or running a system so fragile that if you have to reboot it, it’s a disaster, all of that’s going to go away. We’re already seeing that.

We’re seeing that with the virtualization of the computers. We’re going to see virtualization of other things so that we can spin them up faster. We’re going to see more of this move into the cloud, where it can be maintained better. We’re going to see all sorts of things. We’ve seen a slow leaking of things used in the enterprise, or in IT, coming into OT, but soon it’s really not going to be separate. It’s going to bring skill sets. It’s going to bring technology. It’s going to bring a faster pace of change. It’s going to bring more reliable, more efficient, and more secure control systems to the world. So, that’s going to be good. It’s going to be fought, but the battle’s already done. It’s just a question of how long before the results are seen.

Bryson Bort:

All right, this is a grab bag: any last points you’d like to make or anything you’d like to discuss?

Dale Peterson:

I think the key for me, if I could… You’ve got a Washington audience that listens to this, and people that interact with Washington. I’ve interviewed ICS leaders in the government for over a decade now. I’ve had Marty Edwards on stage. I’ve interviewed Chris Krebs. I’ve interviewed a variety of other people in DOE. One of the things I’m always pushing them for is metrics related to what they’re trying to do. So, they have these goals. They have these programs they’re going to try to do. I would really like to see metrics related to those. Not just counting metrics; I don’t care how many vulnerability alerts they’ve put out, and I don’t really care how many companies they’ve assessed, because it’s always going to be such a tiny fraction.

I would like to see them really think about: what are our goals? What are we trying to achieve with this effort in the government? How are we going to measure success? How are we going to report that out to the industry? I’ve been asking for that, as I said, for more than 10 years. I get very nice, very friendly people giving me non-answers, but I’d really like to see that. I didn’t really see that in any of the NDAA Solarium stuff. It probably doesn’t belong there, but it certainly would belong in the plans that the various agencies and departments have internally.

Bryson Bort:

I think we’re still trying to establish what the government’s mission is. I mean, as you were talking about it, it really is confined to more of a megaphone and a way to prioritize certain efforts, but the scope of the problem is larger than the collective agencies across the SSAs can really tackle.

Dale Peterson:

It’s funny, because it’s almost like with S4: I don’t really care if I agree with the research or the idea. I just want to put it out there. In a sense, I almost don’t care at this point what their goal is. I would just like to see, okay, if they’re going to go after this, how are they going to measure success? When I interviewed Chris Krebs back in August, I said, “What’s the most important thing that you’re doing, that you’re going to focus on, with the private sector industrial control system community?” He said, “It’s this working group, the ICSJWG.” It’s a working group effort that happens twice a year regionally, and about 200 people attend.

So, if that’s really your most important thing, then you’d want to multiply that by 10 or 100. You want to find ways of getting it out there and getting more people involved to build the community, if that’s truly what you believe. I’m not saying that should be their goal, but I’d really like to see them come up with a goal, whatever it is, and just show some progress towards achieving that goal, show a plan, show how they’re measuring it.

But I agree with you, it’s a really hard job. I’d probably last three months in there, because I’d make too many people angry. Because you really have to think about, of the hundreds of things you can do, which are the most important? Which sectors are you going to pay less attention to? Which smaller organizations are going to be left on their own? It’s a tough job, but it really is a part of risk management from a nation-state point of view.

Bryson Bort:

Well, that is why it’s called critical infrastructure.

Dale Peterson:

Yup.

Bryson Bort:

Well, Dale, really appreciate having you on the podcast today. I think we covered a great range of different aspects.

Dale Peterson:

Oh, my pleasure. I hope everyone has a great 2021. As always, we’re making progress. It’s never as fast as we’d like, but we’ll keep working on it and appreciate everyone who’s trying to make things better.

 
