“In our lifetime, for the next 10 or 20 years, artificial intelligence is not about a rebellion of the robots (à la the Terminator). It is an industrial revolution. It’s a rewiring of business, military, the economy, our society with AI and automation in all its various forms… In these scenarios, AI is not just about prediction, it’s also about influence.”

This week, the second half of our conversation with author and strategist P.W. Singer on Hack the Plant, a cybersecurity podcast produced by the R Street Institute and ICS Village. He discusses his latest book – Burn-In – where he translates real-world research about artificial intelligence into a glimpse at a future we’re not too far away from if things go wrong and we do not protect ourselves. (Listen to the first episode with P.W. about the future of war here.)

(Subscribe to Hack the Plant on Spotify or Apple, by RSS feed or search for it wherever you listen to podcasts.)

Transcript:

Joshua Corman:

Our dependence on connected technology is growing faster than our ability to secure it, especially in areas affecting public safety and human life.

Bryson Bort:

I’m Bryson Bort. And this is Hack the Plant. Electricity, finance, transportation, our water supply. We take these critical infrastructure systems for granted, but they’re all becoming increasingly dependent on the internet to function. Every day I ask and look for answers to the question: does our connectivity leave us more vulnerable to attacks by our enemies? I’m a senior fellow at the R Street Institute and the co-founder of the nonprofit ICS Village, educating people on critical infrastructure security with hands-on examples, not just nerd stuff. I founded GRIMM in 2013, a consultancy that works the front lines of these problems every day for clients all over the world.

P.W. Singer:

It’s playing out in Israel right now, where hackers have been going after Israeli water systems. Again, not to steal information from them, but to change the setting on the chemicals in Israeli water.

Bryson Bort:

Each month, I’m going to walk you through my world of hackers, insiders and government working on the front lines of cybersecurity and public safety to protect the systems you rely upon every day.

P.W. Singer:

If you think that the small-town water authorities and the mom-and-pop-sized companies have better cybersecurity in the US than the Israelis do, I’ve got really, really bad news for you.

Bryson Bort:

An attack on our critical infrastructure, the degradation to the point that they can no longer support us, means that we go back to the Stone Age, literally overnight.

Joshua Corman:

If we think the government’s going to solve it for us, we’re wrong. We have to help them.

Bryson Bort:

This is not a podcast for the faint of heart. If you want to meet those protecting the world and what problems keep them up at night, then this is the podcast for you.

Bryson Bort:

In our first episode, we talked with P.W. Singer about how he approaches cybersecurity and what keeps him up at night. What’s different about P.W. is that he turns his expertise and policy insights into fiction books that everyone can understand and brings them into the conversation. The last time we talked to him, you learned about his first techno-thriller, Ghost Fleet, which looked at the future of war and what he learned from writing it.

P.W. Singer:

It actually had greater policy impact than my non-fiction work. With my non-fiction work I was doing Fortune 500 consulting and it was on the military reading list, but Ghost Fleet was the one that got me invited to the White House Situation Room. It was the one the Navy literally named a $3.6 billion program after. Called it Ghost Fleet. And it had greater impact. There’s no other way to cut at it.

Bryson Bort:

This episode, you’ll learn about his latest book Burn-In, where he translates real-world research about artificial intelligence into a glimpse at a future that we’re not too far away from if things go wrong, if we do not protect ourselves. And that’s where our conversation picks back up.

P.W. Singer:

AI is… It’s a game changer, and there are so many different ways of discussing it. Numerically, we can show it. Simply put, 91% of leaders say that AI is the most important technology out there. It’s the game changer for them and their organization. And you see this in everything from the new US National Defense Strategy to the business plans of everything from tech companies to John Deere tractors. So 91% of leaders say AI is the most important thing out there. Only 17 percent say they even understand the basics of it, let alone its applications or its dilemmas. So you have this massive disconnect. And so what we try to do with Burn-In is… It’s a techno-thriller. It’s the story of a hunt for a cyberterrorist through the Washington, D.C. of the future. But baked into the story are over 300 nonfiction explanations and predictions of everything from how AI works, to what the key cybersecurity vulnerabilities are that a bad guy might go after, to what the legal and ethical dilemmas are that we’re going to have to figure out everywhere from our streets to parenting.

P.W. Singer:

And it’s not just that these 300 are baked into the story. There are 27 pages of endnotes to document that it’s not us dreaming them up. Here’s the reference, the nonfiction reference, to show that it’s real. I’ve joked, as a parent… It’s like sneaking veggies into a morning smoothie for your kids. Just in this case, it’s for the public and policymakers. It’s to carry across information in a manner they’re more likely to read and more likely to digest and act upon than your boring white paper or standard PowerPoint. And by the way, I’ve got the hard data to prove that it’s more likely to be read, more likely to be acted upon, but hopefully you just enjoy it.

Bryson Bort:

So let’s take this opportunity. What is artificial intelligence? Why exactly is it a threat? It isn’t the Terminator coming out to get us; that’s not happening. So what is really the threat, and what can we do about it?

P.W. Singer:

Oh, wow. The subtitle of the book says it all. Burn-In is the title. The subtitle is A Novel of the Real Robotic Revolution, and you hit it exactly. Here again, why the book? We’re at the 100-year anniversary of the creation of the word robot. It was actually created in 1920 for what we would call science fiction. It was a play, R.U.R., and in it the writer was looking for a word to describe this new character he’d come up with: mechanical servants who become intelligent. They wise up and then they revolt against their human masters. They rise up. And ever since, that notion of the robot as the kill-all-humans… And it’s derived from the Czech word for servitude, for slavery… So ever since, that has been the way that we’ve talked about robotics and AI.

P.W. Singer:

And to your question of “What’s AI?”, there’s actually a footnote in Burn-In that takes you to… It’s an incredibly contested term. There’s an academic study that looked at just the various termings of it, not how to term it, and found that experts have over 40 different definitions that they argue back and forth over as the best definition of AI, of artificial intelligence. So, however I answer you, there are going to be 39 other groups’ definitions out there. They go, “That’s the wrong way.” But basically we’re talking about a machine of some kind that is simulating or surpassing human decision-making, whether it’s doing it at speed or pulling in greater data, and we could go on and on about this. I don’t want to cut… It’s one of those debates the monks would have over how many angels could fit on the head of a pin.

P.W. Singer:

To go back to your main, core question, the issue is not the science fiction of robot revolt. And that discourse has stuck through all our sci-fi: the Terminators, The Matrix, 2001’s HAL… That’d be fine if it stayed in science fiction, but it actually shapes the way we talk, think and act about them in the real world, including the policy world. For example, the killer robots debate has taken off everywhere from the human rights world, to being an issue that the Pentagon is constantly dealing with, to literally being the substance of a debate at the United Nations. The foundation and individual-donor world has, in sum total, given almost $6 billion to research the existential threat from robots. The hard reality… Here again, you notice how I keep going back to this in reference… The reality is maybe one day you and I are going to have to figure out, do we fight or salute our metal masters? Maybe. But in your and my lifetime, for the next year, 10 years, 20 years, the issue is not a rebellion of the robots.

P.W. Singer:

It’s an industrial revolution. It’s a rewiring of business, military, the economy, our society with AI and automation in all its various forms. And from that, you have a series of questions that you have to figure out, and it touches everything from job displacement and replacement. There are a wide variety of professions that are dealing with this, whether it’s the factory worker or the contract lawyer or… I recently did a discussion with a group of Wall Street traders. Same thing in medicine, the military, you name it. They’re all wrestling with, “Okay, what are the roles that can be automated or not? And for the roles that can’t be automated, what’s the mix of human and machine together?” And that of course has massive ripple effects on the economy and on the people whose jobs are automated. You can go on and on about that.

P.W. Singer:

Second issue: you have a series of legal and ethical questions that we’ve never dealt with before. Machine permissibility, machine accountability. And then the third issue, the substance of what you and I have been talking about here, is all sorts of security questions that take us to places we’ve never been before. One is the essence of what it means to wire up the economy to networks that are increasingly vulnerable. So yes, you get smart thermostats in your home, smart cars, smart cities, smart military bases. That also means you get a hackable thermostat, a hackable home, a hackable city, a hackable military base. And you get the physical consequences of it, because a machine, even an intelligent one, that is directed to do something will keep doing it: if you change the sensor information or you change the instructing protocol, it doesn’t notice that things are off. It’s more likely to continue to carry out that action. So you’ve got that physical consequence, which is what we explore in Burn-In, but you also have the other element of it, which is a huge issue: how it changes the nature of privacy.
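To make that point about spoofed sensor data concrete, here is a minimal, hypothetical sketch; the dosing logic, thresholds and values are invented for illustration and are not from the interview. A naive control loop that blindly trusts a reported reading will keep carrying out its action even after an attacker changes the sensor information, while even a crude plausibility check at least flags that something is off.

```python
# Hypothetical sketch: a naive dosing controller that trusts whatever the
# sensor reports, versus one with a crude sanity check. All numbers invented.

TARGET_PPM = 1.0   # desired chemical concentration
MAX_PPM = 4.0      # physically plausible ceiling for this system

def naive_controller(reported_ppm: float) -> str:
    # Blindly trusts the reported value, so a spoofed low reading keeps it dosing.
    return "DOSE" if reported_ppm < TARGET_PPM else "HOLD"

def hardened_controller(reported_ppm: float, last_ppm: float) -> str:
    # Same logic, plus basic checks on the sensor data itself.
    implausible = reported_ppm < 0 or reported_ppm > MAX_PPM
    sudden_jump = abs(reported_ppm - last_ppm) > 1.0   # rate-of-change limit
    if implausible or sudden_jump:
        return "ALARM"   # stop and involve a human rather than act on bad data
    return "DOSE" if reported_ppm < TARGET_PPM else "HOLD"

# An attacker pins the reported value at 0.2 ppm while the real level climbs:
print(naive_controller(0.2))           # DOSE, for as long as the spoof persists
print(hardened_controller(0.2, 1.1))   # ALARM, the drop from 1.1 is suspicious
```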

P.W. Singer:

And to put it bluntly, every one of those things that’s out there, whether it is that smart thermostat, whether it is that app on your phone, whether it is that smart building itself, they are collecting information. They’re doing it in very evident ways: the cameras and face recognition. They’re doing it in ways that may not be evident: your clicks, your buys, your physical movements, your temperature. All of that information is then… The key is that it’s not just being collected; it now creates a massive dataset.

P.W. Singer:

So think face recognition technology, something that’s been rolled out by everything from police forces to Rite Aid, the drugstore. Face recognition matches Peter’s face to an identity, but the key is that it then takes that identity and matches it to all the data that’s been collected on him. Everywhere he’s been, everything he’s bought, everything that his family has bought, everything he’s posted. And so we take that data and now we know something about him. But when you add in artificial intelligence, it’s not just your history and that loss of privacy; it’s also now about prediction. Based on all this, what is he going to do next? Where is he going to go next? Is he maybe going to commit a crime next or not? How might he vote next? And then with AI, it’s not just about prediction, it’s about influence. How can we influence what they’re going to buy? How can we influence who they’re going to vote for, or whether they’re going to vote at all? And so these are the new issues that emerge with AI and automation, and they, again, are of a level that is industrial revolution in scale.
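To illustrate the linkage he describes, face to identity, identity to collected data, data to a prediction, here is a hypothetical sketch; the names, records and “model” are invented, and a real face-recognition pipeline would be far more involved.

```python
# Hypothetical sketch of the data linkage described above. All data invented.

collected_records = {
    "peter": {
        "locations": ["gym", "pharmacy", "office"],
        "purchases": ["allergy meds", "running shoes"],
        "posts": ["training for a 10k"],
    }
}

def recognize_face(image_embedding) -> str:
    # Stand-in for a real face-recognition model: returns the matched identity.
    return "peter"

def profile_and_predict(image_embedding):
    identity = recognize_face(image_embedding)       # step 1: face -> identity
    history = collected_records.get(identity, {})    # step 2: identity -> collected data
    # step 3: a toy "prediction" drawn from the joined history
    purchases = history.get("purchases", [])
    likely_next = "sporting goods store" if "running shoes" in purchases else "unknown"
    return identity, history, likely_next

print(profile_and_predict(image_embedding=None))
```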

Bryson Bort:

I liked how you defined artificial intelligence: less by going into the technical definition of what it is and more by its relevance. The relevance of artificial intelligence is that someday computers will be able to surpass human capability. In the meantime, we slowly and fractionally achieve pieces of that performance until suddenly we wake up and it’s some substantial amount. But that’s the future.

P.W. Singer:

Let me pause here, because now we are getting into kind of the wonky definitions of it. How you framed it is what’s sometimes called general AI, which is a system that is able to move across different areas in the way a human can. There are things that we do that are highly intelligent: mathematical calculations, but also walking down the street. And general AI is the notion that you can bring all of that together. We’re not there yet. That, as you put it, is off in the future. Now, how far off is it? Is it 10, is it 20 years?

P.W. Singer:

But we are already seeing the application of AI in individual formats, whether it’s the field of medicine using it to… Right now, think about coronavirus. AI is being used for everything from looking at patients’ lungs to help identify the optimal treatment for them, in a way that human doctors would not be able to do on their own, to one of the more interesting uses: AI going back and reading all of the old journal articles, because there are literally tens of thousands of journal articles, drug studies, you name it, to try and find whether there is something in the past that might be useful against coronavirus. So it’s not an AI that is able to do that and play chess at the same time, which is something else it could do, but it can do each individually.

P.W. Singer:

What I’m trying to get at is, now that we’ve sort of unpacked the definitions of it in kind of a wonky manner, we should not think of AI as off in the distant future. It is already changing our world today. Of automation we say, “Well, one day it might change our economy.” Well, for people in manufacturing, over 80% of the job loss in the last generation was due to automation, the way that factory assembly lines have changed. Some people want to blame, “Oh, those gosh-darn foreigners took our jobs.” Sorry. The hard data shows that the vast majority of it was automation, and so your policy needs to be dealing with that rather than the other. But now, put our cybersecurity hats on. Everything that we just talked about, whether it’s using AI for medical research or automating the assembly line, in the here and now all of that opens up vulnerabilities that a bad guy might go after. And here again, we’ve seen this. We’ve seen targeting of automated assembly lines. We’ve seen targeting of medical research on coronavirus.

Bryson Bort:

We’re going to stop for a minute, because my next question to P.W. is about what worries him most about tying our critical infrastructure to the internet. You heard his worries about the internet in the first episode, but I just want to remind everyone where he stands.

P.W. Singer:

What’s playing out is that the internet itself and how we use it is changing, and we have to ensure that our security keeps up with that. If you think about the first generations of the internet, it was about communication between people. Initially just scientists, then the rest of us. And along the way, all of the cybersecurity issues surfaced, primarily around the theft of information. Be it the theft of your credit card, be it the theft of intellectual property to build your own version of that jet fighter if you’re China, that’s what’s been playing out, and it’s been challenging enough.

P.W. Singer:

Now we are essentially wiring up what’s called the Internet of Things, and the Internet of Things involves everything from your smart home to smart cars to smart cities to all the various forms of critical infrastructure out there. And one of the misconceptions is that people just think about it as, “Oh, the power grid might go down.” No, no, no, no. It’s much more than that. It’s everything from water treatment plants to transportation networks, you name it. And unfortunately, we’re, in short, repeating all of the mistakes that we made previously. We’re not baking security into the emergent Internet of Things, including critical infrastructure, in the way that we need to. And there’s a variety of reasons behind that. It’s a lack of regulation. It’s a problem of balancing convenience and security. It’s cost savings. You name it, but that’s what’s playing out.

Bryson Bort:

It’s the classic chicken-and-egg problem. Function always comes first. Security always comes second. But then what, and who, comes first? I don’t buy a car because it’s secure. I buy it to get from point A to point B, or for the emotional appeal of a hot car. And this is our problem here. We’ve connected out of convenience, and it’s now really inconvenient to secure when we look at the capital cost of replacement, the inability to outright fix vulnerabilities and the length of the manufacturing cycle to remediate. On a future episode, we will be joined by the chief product security officer of a large manufacturer for more on this conversation.

Bryson Bort:

What is the number one threat that you see now to critical infrastructure from a national security perspective? What keeps you up at night, P.W.?

P.W. Singer:

How we are repeating almost all the mistakes of the first wave of internet use in our new versions of the internet. We are not baking security in, from the design side all the way to how it is deployed. And in many areas, there is an absence of good government guidance and standards, and guess what? In some situations you do need regulation. That is the hard reality of the way all industry in America works: it is not self-policed. We do not treat industry the way we do golf matches among friends. Even professional golf has someone policing it. So that’s in the midstream of it, and then all the way down to the user side, the company that’s deploying it, the consumer side, security is not being baked in… It’s everything from the simple things of “Hey, you actually have to be required to change your password” to how we educate around it.

P.W. Singer:

To put it bluntly, I worry we’re recreating many of the mistakes, and that means our approach will probably follow where we wait for the bad thing to happen, and then we act. And that scenario, again, will play out whether it’s for the individual company, “I could have put these security procedures into place for a minimal amount of money, and instead I waited for our systems to be hacked and only then did we put them in,” to how we think about it as a nation. That’s my concern. This will not be a “How could we have known?” scenario. This will not be a 9/11 Commission saying, “You know, it was a failure of imagination. We could not have contemplated this happening.” What will happen is something that people will have talked about, warned about for years, something that minimal measures could have prevented.

P.W. Singer:

And that’s what worries me. That’s, again, what we’ve tried to play out in Burn-In: we hope people see it not as an act of prediction, but of prevention. By exploring scenarios of the bad things happening, you can entertain people, and hopefully people will enjoy it and find it entertaining, but also hopefully it’ll prompt that reaction of, “I did not like seeing that play out in the story. What can we do to prevent that? Oh, actually it’s not that hard? Huh. Let’s do it.” And again, that’s at every level, from, hopefully, the governmental policymaker all the way down to someone deploying a certain system in your home. That smart thermostat, that app for your kid. Hopefully it prompts that thought of, “Okay, what can I do to prevent the bad thing from coming true?”

Bryson Bort:

You talked about how we could use some government regulation. What kinds of policy changes, specifically, do we need to protect our country now? What veggies should we be eating?

P.W. Singer:

I think a wonderful starting point for this and important because it’s both bipartisan and doable is the recommendations of the Cyberspace Solarium Commission.

Bryson Bort:

During the Cold War, the United States faced a strategic challenge where risk threatened to rapidly outpace the country’s ability to respond and counter it. The Soviet Union presented a superpower threat that needed to be addressed quickly. President Dwight Eisenhower convened Project Solarium, bringing together senior government officials and outside experts to develop approaches to counter the Soviet threat. The new Cyberspace Solarium Commission draws inspiration from this historical legacy. Established by the 2019 National Defense Authorization Act, it’s a bipartisan, intergovernmental and multi-sector body charged with evaluating divergent approaches to defending the United States in cyberspace and driving consensus toward a comprehensive strategy to shape behavior, deny benefits and impose costs on our adversaries. The report can be found at solarium.gov.

P.W. Singer:

There’s all sorts of things that I would love to happen that, frankly, aren’t going to happen in our political environment. The commission report is a wonderful starting point. It’s a group of experts in cybersecurity, but also members of Congress saying, “Here’s a set of things that we can put into place that would make a difference. Would not end all problems, but would make a difference. And, we think they are politically possible.” And so, I would steer people to that report and say, “You know what? This is a great starting point.” And again, if I recall, there’s something like 80 different recommendations and some people might want to argue back and forth like, “I like this one but not…” But it is 80 different things that we could do to end up in a much, much better place.

P.W. Singer:

So here again, I’m saying it’s not a mystery. It’s there. Now, what frustrates the hell out of me is when you see partisan members, and this includes [inaudible 00:26:43] members, or the Trump administration, take a bipartisan thing where Democrats and Republicans came together and said, “We collectively agree on this,” and then reject it because it’s not just an R thing. Or reject it because, “Well, maybe it makes us look bad, because it makes it look like the President’s not doing enough on cybersecurity.” Or, “Ooh, maybe it reminds people of all the things that played out in 2016, and we don’t like that.” Or, “Maybe it might offend Russia.” Sorry, this is a bipartisan set of recommendations, so it frustrates me when it is put into that lens. As you and I are speaking right now, only a limited set of those recommendations appears to be getting implemented, because of fear of offending the administration. That’s frustrating.

P.W. Singer:

On the cybersecurity side, a bit of a frustration is that sometimes you’ll hear experts in the field say, “You know, well, but that won’t solve all the problems.” Or, “This bad scenario might still happen.” I guess the way to make a parallel is that too often experts in the cybersecurity field sound like doctors saying, “You know what? It’s of no use to cover your mouth when you cough because that does nothing to stop breast cancer.” What I’m getting at is that they will say, “You know what? It’s no use trying to get the basics right because there is this high-level threat that might still get through.” No, let’s at least get the basics right. Yes, the high-level threat might still get through, so let’s at least do some of the basics so that we’re not spending so much time on the low-hanging fruit in this space.

P.W. Singer:

So those are my two frustrations. You asked me what we could do. My thing is, we know what we can do. There’s a list of what we can do. There are things we know can be put into place that would make it a much better situation. Instead, we have to go after why people are rejecting it, and too often it’s these other reasons: raw partisanship, strange things going on in national security today, or a focus on the most extreme threat scenarios versus closing up the low-hanging fruit. So again, I think that’s how we have to think about it.

Bryson Bort:

Let’s put partisan politics and reality aside. If you could wave a magic, non-internet-connected wand, what is one thing you would change about infrastructure itself or about policy?

P.W. Singer:

We would have an entirely new critical infrastructure that is designed around the concept of an escalator. What do I mean by that? An escalator is a wonderfully designed system in that it is a vast improvement over what it replaced, stairs. However, if it fails, it just goes back to the level before. It’s still workable. It’s still useful. So, if we could have critical infrastructure that is like an escalator, you would get all the wonderful, new, awesome improvements, but if it fails for any reason, it still works. It’s not catastrophic. That’s what I’d love to have. That’s what my magic wand would put into place.

Bryson Bort:

All right, P.W., you waved your magic wand. Now we’re going to look into the crystal ball. Five-year prediction. One good thing and one bad thing.

P.W. Singer:

One good thing. We have started to implement all the various positive sides of the Internet of Things, whether it’s more smart cars, smart buildings, you name it, and they have started to make an appreciable difference in both people’s lives… You get all of this cool, amazing convenience. Your life’s improved by it. There’s monetary savings for you, for your business. They’ve even started to make a difference in things like reducing power consumption and helping with things like climate change. So that’s the positive scenario of all of this.

P.W. Singer:

The negative scenario is that it’s leading to a massive level of privacy loss and influence by other actors, be they government, be they private sector, and it’s even opening up vulnerabilities. So we are seeing catastrophic-style attacks. We’re seeing the Burn-In scenario of someone recreating a biblical plague-type attack through cyber means.

P.W. Singer:

So you get the good, but you also get the bad with it, and that, frankly, is what’s always happened with technology. The very first technology was a stone that someone picked up and they either used it to build something or they used it to bash someone in the head. It’s the same thing with the internet. That’s how I approach it, is that there’s wonderful, awesome possibilities, let’s grasp toward those, but let’s also be aware that there are bad outcomes that can happen. So, let’s go into it with our eyes wide open and try and steer it more towards the good outcomes and try and reduce the costs of the bad ones. Try and reduce the likelihood of the bad ones.

Bryson Bort:

P.W., thank you for joining us today. Really appreciated all of your stories and insights.

P.W. Singer:

All right. Thank you so much for having me.

Bryson Bort:

Thank you for listening to Hack the Plant, a podcast of the R Street Institute and the nonprofit ICS Village. Subscribe to the podcast and share it with your friends. Even better, rate and review us on Apple Podcasts so we can reach even more listeners. Tell us what you thought about it and who we should interview next by finding us on Twitter @RSI or @ICS_village. Finally, do you want to know more about R Street or ICS Village? Visit rstreet.org or icsvillage.com. I’m your host, Bryson Bort. Thank you to Executive Producer Tyler Lowe of Phaedo Creative, Creative Producer William Gray and Editor Dominick Sterett of Sterett Production.

 
