ITSPmagazine Podcast Network

From Secure Foundations to Resilient Futures: The UK's Digital Security by Design Initiative | An Infosecurity Europe 2024 Conversation with Professor John Goodacre | On Location Coverage with Sean Martin and Marco Ciappelli

Episode Summary

In this On Location podcast, Sean Martin engages in a thought-provoking conversation with Professor John Goodacre, a tech innovator and government program director. Explore the transformative journey towards secure-by-design technology, as Professor Goodacre navigates the complex realm of cyber resilience and unveils groundbreaking strategies to fortify digital infrastructures against ever-evolving threats.

Episode Notes

Guest: Professor John Goodacre, Director Digital Security by Design, University of Manchester, UKRI [@UKRI_News]

On LinkedIn | https://www.linkedin.com/in/john-goodacre-722b59/

____________________________

Hosts: 

Sean Martin, Co-Founder at ITSPmagazine [@ITSPmagazine] and Host of Redefining CyberSecurity Podcast [@RedefiningCyber]

On ITSPmagazine | https://www.itspmagazine.com/sean-martin

Marco Ciappelli, Co-Founder at ITSPmagazine [@ITSPmagazine] and Host of Redefining Society Podcast

On ITSPmagazine | https://www.itspmagazine.com/itspmagazine-podcast-radio-hosts/marco-ciappelli

____________________________


In this episode of the "On Location with Sean and Marco" podcast, host Sean Martin flies solo to engage in a riveting conversation with Professor John Goodacre, Director of a UK government program and a renowned figure in the tech industry. Professor Goodacre sheds light on his diverse career journey, spanning from telecoms to supercomputers, with a key focus on cyber resilience and system integrity.

Emphasizing the need for a holistic approach beyond patching vulnerabilities, Professor Goodacre discusses the inception of the Digital Security by Design program in 2019. He delves into the program's aim to revolutionize technology foundations, collaborating with industry giants like Microsoft and Google to enhance digital infrastructures globally.

The conversation explores the significance of memory safety in software, highlighting the ongoing battle against cyber threats and the necessity for robust security measures at the hardware and software levels. Professor Goodacre's insights underscore the imperative shift towards secure by design and default practices to combat evolving cybersecurity challenges effectively.

Furthermore, the episode touches upon the collaboration between academia, businesses, and governments to implement secure frameworks and educate stakeholders on the importance of cybersecurity. Professor Goodacre advocates for a proactive approach, stressing the economic benefits and risk mitigation associated with investing in secure technologies and practices.

Listeners are left with a deepened understanding of the crucial role memory safety, compartmentalization, and secure design play in fortifying digital ecosystems against cyber threats. Professor Goodacre's illuminating discussion paves the way for a paradigm shift in cybersecurity strategies, fostering resilience and integrity in the digital landscape.

Top Questions Addressed

How can fixing the foundations of technology, rather than endlessly chasing vulnerabilities with patches, change the economics of cyber resilience?

What do memory safety and fine-grained compartmentalization (CHERI capability hardware) actually change about the way hardware runs software?

How do secure-by-design and secure-by-default practices scale across the supply chain, from chip architectures to developers, businesses, and governments?

Be sure to follow our Coverage Journey and subscribe to our podcasts!

____________________________

Follow our InfoSecurity Europe 2024 coverage: https://www.itspmagazine.com/infosecurity-europe-2024-infosec-london-cybersecurity-event-coverage

 Smashing the Stack; All Good Things | Exploring Software Lifecycles from Secure By Design to End of Life | An RSA Conference 2024 Conversation with Allan Friedman and Bob Lord | On Location Coverage with Sean Martin and Marco Ciappelli: https://redefining-cybersecurity.simplecast.com/episodes/smashing-the-stack-all-good-things-exploring-software-lifecycles-from-secure-by-design-to-end-of-life-an-rsa-conference-2024-conversation-with-allan-friedman-and-bob-lord-on-location-coverage-with-sean-martin-and-marco-ciappelli

On YouTube: 📺 https://www.youtube.com/playlist?list=PLnYu0psdcllTcLEF2H9r2svIRrI1P4Qkr

Be sure to share and subscribe!

____________________________

Resources

Progress for the DSbD Initiative and CHERI Capability Hardware: https://www.infosecurityeurope.com/en-gb/conference-programme/session-details.3783.219352.progress-for-the-dsbd-initiative-and-cheri-capability-hardware.html

Learn more about InfoSecurity Europe 2024: https://itspm.ag/iseu24reg

____________________________

Catch all of our event coverage: https://www.itspmagazine.com/technology-cybersecurity-society-humanity-conference-and-event-coverage

To see and hear more Redefining CyberSecurity content on ITSPmagazine, visit: https://www.itspmagazine.com/redefining-cybersecurity-podcast

To see and hear more Redefining Society stories on ITSPmagazine, visit:
https://www.itspmagazine.com/redefining-society-podcast

Are you interested in sponsoring our event coverage with an ad placement in the podcast?

Learn More 👉 https://itspm.ag/podadplc

Want to tell your Brand Story as part of our event coverage?

Learn More 👉 https://itspm.ag/evtcovbrf

Episode Transcription

From Secure Foundations to Resilient Futures: The UK's Digital Security by Design Initiative | An Infosecurity Europe 2024 Conversation with Professor John Goodacre | On Location Coverage with Sean Martin and Marco Ciappelli

Please note that this transcript was created using AI technology and may contain inaccuracies or deviations from the original audio file. The transcript is provided for informational purposes only and should not be relied upon as a substitute for the original recording, as errors may exist. At this time, we provide it “as it is,” and we hope it can be helpful for our audience.

_________________________________________

Sean Martin: [00:00:00] And hello everybody. You're very welcome to a new On Location episode. This is Sean Martin, and Marco and I, even though he's not here, are on our way to Infosecurity Europe in London, and we're excited to hear all kinds of things, obviously cyber related to business, policy and governance, and public and private sector privacy. 
 

I'm sure we'll touch on privacy as well. All in the name of supporting and enabling secure business, protecting customers along the way, and, hopefully, if we do our job right, a safer society that gets along. So, as part of our coverage, we're chatting with a few keynote speakers, and I'm thrilled to have Professor John Goodacre on. 
 

How are you, John?  
 

John Goodacre: Very good. Thank you, [00:01:00] Sean.  
 

Sean Martin: That's great. You're a busy man, researching stuff, involved in programs, uh, attending and speaking at conferences. Um, give us a little background on who Professor Goodacre is and what you're up to these days.  
 

John Goodacre: Yeah. So, you know, I've had a varied career, starting in telecoms. 
 

I spent six years at Microsoft in Redmond. I spent 17 years at Arm and it was really on the back end of that Arm stuff that I started obviously being a professor doing supercomputers, which we're not going to talk about today. But the other thing that I started was, uh, worrying about, uh, our cyber resilience and integrity of our systems and whether or not there's more we can do. 
 

than just patch and chase after bugs before the, uh, attackers find their way in and take over our system. So that's really what I've been doing for the last five years. And I've been doing that as a director of a UK government program. So I'm sure we'll get into more details of that as we, uh, talk this evening. 
 

Sean Martin: But, uh, John, I'm afraid that when [00:02:00] we say we're going to take care of chasing vulnerabilities with patches, we're going to take away some job security from folks, I think.  
 

John Goodacre: Oh, I don't think so. I don't think there'll ever be a silver bullet. Myself and an ex-colleague from Arm, I suppose we're still trying to attack this problem together, were giving government select committee evidence just last week, in fact, and he described it as an ongoing arms race against the attackers. 
 

So, yeah, I think the big challenge we identified was that things haven't really changed for quite a long time, and it's just getting unsustainable for a lot of businesses to be able to maintain this. This is then obviously driving a skills shortage of people that can actually run around. 
 

So what we looked at was, is there a way of actually protecting people by [00:03:00] designing the technology? And obviously in the UK government's national strategy, we looked at bringing technology into the funding scheme. And it was that scheme that actually started this Digital Security by Design program back in 2019. 
 

And what we wanted to do was basically say, can we actually fix the foundations of technology so that the technology itself can protect us, as opposed to leaving it to the billions of users of our digital systems to try to manage and de-risk themselves. So the program kicked off then; at the time it was a £70 million investment by the UK government, but really it was to overcome a market failure. 
 

And because of that, we were basically the conveners, working with big business, and we had Microsoft, Google, and others all basically promising, you know, at the time over a hundred million pounds of co-investment to say, can we fix the foundations of our [00:04:00] digital infrastructures so that, basically, we can reduce that load. 
 

And that's what the program started to look at. And obviously it was one that looked across the research part. It looked at the technology itself. It looked at what the impact on businesses was, and, you know, pretty unanimously, as people started looking at it, they saw what we could take out. Well, Microsoft actually did a great 
 

paper where they said, basically, if we look back in time at our last five years of patches that we've had to give out, this would have stopped 70 percent of them having to exist. So from that, you can start seeing that there's an ongoing cyber risk reduction. It's no silver bullet, it's not 100 percent or anything silly like that. 
 

But if you can take out 70 percent of the noise of this... And, you know, since that sort of started, we've come up with some good language, it's starting to go global and things like that. So, you know, happy to delve into any of those areas you think [00:05:00] will be of interest. 
 

And obviously in my talk at the show, I'll be going into more details of how we've, you know, changed the language into by-design, by-default cyber security, a pyramid. If we can fix it, you know, with only a few hundred people looking at it and affect billions... it's, you know, quite hard to get a billion people not to click on the link. 
 

How can we make it so that if a billion people click on the link, nothing happens? So that's been our approach in the program, basically to try to fix things by design in the products themselves, at the lowest level of the foundational technology, in the CPU, if we start getting techie, if that's interesting. 
 

Sean Martin: Yeah. Actually, I wanted to start there, because I know you have a lot of history in the hardware, right? With Arm, certainly. So that's the base. And then of course you have the network, which is more hardware and networking protocols. And then you have the applications on top of that, [00:06:00] software up to the users. 
 

So how far and wide does this program cover? Are you working with hardware manufacturers, software manufacturers? 
 

John Goodacre: Yeah, one of the slides I'll probably be using at the presentation is something I call the cyber pyramid, and it's an inverted pyramid where we're balanced on a point, and that point hurts, sitting at that point. And that point of pain has actually been there for about 50 years. 
 

So, you know, computers back in the 40s and 50s, they were very expensive to build. The transistors, every transistor was really, really important. And obviously the cyber threat wasn't really there when we started. But by the 70s, even the American government was starting to publish things saying, we should probably change the way we run software on these chips, because of what any mistake in the software can do. 
 

There's a class of problem called memory safety issues. [00:07:00] A lot of people may have heard that coming out of the U.S. government quite loudly recently: this huge class of general problems where people accidentally, or, as we've seen with supply chain attacks, deliberately introduce problems in software, and, you know, there's no defense in the hardware 
 

against somebody basically starting to execute code, taking your data, putting you to ransom and things like that. So what we did in the hardware, or should I say, the University of Cambridge here in the UK has been looking for about 10 years at what we can do to change the way hardware runs software so that these memory safety issues are not exploitable. 
 

But also, how do we then put software, both the code and all the data, in very fine-grained boxes? It's called compartmentalization: the idea that we, or a developer, can protect themselves against either deliberate or accidental [00:08:00] vulnerabilities in third-party code. And can we do this in the way that people write software today, which is a bit of this software, a bit of that software, let's glue it together and ship a product? 
 

Yeah, nobody's going to rewrite the trillions of lines of code that they currently integrate in a different language, and that still isn't, you know, making a significant difference to the risk of that system having issues in the future. So, yeah, what we've ended up with, and what the program has been able to show in the last five years, 
 

is that we stop those vulnerabilities. We've shown that developers can have a significant productivity gain, in the sense that things that may be slightly wrong in their design or their implementation are caught very quickly. And then obviously in the deployment of it, the fact that those customers don't have to run around patching quite as often. 
 

And, you know, things that today are very serious zero-day bugs, you know, [00:09:00] let's look at the WebP one recently, where somebody could just send an image to you by whatever means and take over your entire handset or computer. Not a good environment to be in. Whereas the bug may have still existed, but run it on this kind of hardware, 
 

and it would have been nothing. Worst case, a crash. Best case, it says, sorry, invalid image. You know, it's that kind of change that we're looking at bringing through to the platform, and obviously in my talk, I'll be covering more: how does it do it? How do we get that into the supply chain? How do customers access the technology, where's it actually coming from? 
 

How, you know, Customers access technology, where's it actually coming from? And that going back to the pyramid, there's only really three architectures, the arm architecture, the risk five architecture, and the x86 architecture fix it in those three. And you've actually got the billions and billions of devices across the world in all sectors, all markets, uh, having this level of upgrade, if you like, to the way that hardware runs software. 
 

Sean Martin: Interesting. So I'm trying to, [00:10:00] trying to picture this. I don't want to demean it, but is it a shim between the hardware and the application? Or what's that look like?  
 

John Goodacre: Well, there is a shim. It's called an instruction set architecture. So basically hardware has a contract with software. It says, if you show me this instruction to add two numbers, then I'll add the two numbers together for you. 
 

Okay. So it's at that fundamental level of hardware and software. If it says, if this value is greater than one, then jump, else don't jump. It's that instruction level within the computer that's changing. And at the moment, the computer has a view that if I've got an integer value, a number, then I can use that to point at memory and read the memory. 
 

Okay, or if that's on the stack: I've got a string on the stack, a sequence of characters. Oh, let's make it a bit longer. Oh, those characters are now instructions: please, when you return off the stack, start running my code. So those kinds of things are basically inherent [00:11:00] in the mistakes of software. And I don't think anybody has found, 
 

no, I don't think I've found anybody that will say there's never a bug in software, apparently not even in some fairly high-integrity software. 
 

Sean Martin: There's a reason the OWASP Top 10 has lasted for so many years, right? Yeah. 
 

John Goodacre: And about five or six of those top 10 classes of errors are all of these memory problems, you know: using something that was freed, writing off the end of a buffer, or, you know, OpenSSL, Heartbleed, reading off the end of a buffer. 
 

So I can go and get hold of your keys and your password. Those are all the kinds of things where, if you don't just use an integer but use something called a capability, the hardware says, I can only read, write or execute what I've been given permission to, in a very fine-grained, high-performance way. Then, hey presto, the world's a much better and safer place to work, your systems are more resilient, and the integrity of your data 
 

and operations is higher. So it's not something that you effectively [00:12:00] buy and apply. It's not a tool. It's not a monitoring tool. It's not a piece of software that you run on your system. It's changing the way, you know, computers historically have run code, to run it in a way that protects you against exploitation and allows you to constrain things. 
 

Take, you know, your image viewer. It's a good example because it's been quite popular recently as a way into a system. Put it in a box and it can only write to its image decode buffer. It can't go reading and writing the boot vector of your code and taking your money out of your bank. So, obviously, it's a different case, but yeah. 
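
And for the "reading off the end of a buffer" class that John ties to stolen keys and passwords (the Heartbleed pattern), here is a second hedged C sketch; the struct layout, names, and lengths are all made up for illustration. A plain pointer happily reads past the reply buffer into the adjacent secret, whereas a capability bounded to the reply buffer would be expected to fault instead.

```c
/* Hypothetical over-read sketch: echo back however many bytes the peer
 * *claims* to have sent, without checking it against the buffer size. */
#include <stdio.h>
#include <string.h>

struct session {
    char reply_buf[16];
    char secret_key[16];   /* happens to sit right after the reply buffer */
};

static void handle_request(struct session *s, const char *req,
                           size_t req_len, size_t claimed_len)
{
    size_t n = req_len < sizeof(s->reply_buf) ? req_len : sizeof(s->reply_buf);
    memcpy(s->reply_buf, req, n);                  /* copy what was actually sent */
    /* BUG: reply with 'claimed_len' bytes, which the peer controls. */
    fwrite(s->reply_buf, 1, claimed_len, stdout);  /* over-read spills secret_key */
}

int main(void)
{
    struct session s;
    memcpy(s.secret_key, "TOP-SECRET-KEY!", sizeof(s.secret_key));
    handle_request(&s, "hello", 5, 32);            /* claims 32 bytes, sent 5 */
    /* With an ordinary pointer the extra bytes leak whatever follows
     * reply_buf, including secret_key.  With a capability bounded to
     * reply_buf, the hardware is expected to fault on the first byte
     * past the end of the buffer. */
    return 0;
}
```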
 

Sean Martin: you know, there's all good things. 
 

So, yeah, so many questions in my mind. Go for it. Let's see. So, let's go here first. Actually, I spoke with Bob Lord. He's with CISA, part of DHS. And we talked a bit [00:13:00] about the Secure by Design pledge that they have. Basically an effort underway to tackle the same issue around memory issues in applications. 
 

My question is, are you working with DHS or other government entities to kind of make this broader?  
 

John Goodacre: Yeah, I've been over to the States and the agencies quite a few times over the years, sort of sharing this view that, you know, we can no longer just do cyber security where the user of a system is responsible and liable for its operation. 
 

We have to go to two lower levels. The first lower level is by default, where those manufacturers are taking more care about not shipping such a large attack surface at us: you know, don't turn on features that shouldn't be turned on, and don't ship with default passwords. We've actually recently made it illegal in UK law to ship a consumer product with default [00:14:00] passwords now. 
 

And then the layer beneath that, where the products and the components, by the design of their implementation, can help protect us. Now, I think that the language of, you know, use memory safe languages is great. Obviously that doesn't very well address the historical legacy of everything, and what we're saying is, great story, 
 

here's a roadmap that can actually bring legacy under that protection as well. And actually, even if you do start writing Rust and other memory safe languages, you know they have an unsafe component to them. Run that Rust on these platforms and you now have that unsafe component also protected for memory safety. And there's this idea that, whether you're writing in a memory safe language or a legacy language, you can start constraining and bringing in policies between libraries in your code, between functions and data structures, and that again can operate at very high performance. 
 

We're seeing some of the, you [00:15:00] know, I'm sure. Yeah, our listeners have heard some of the vulnerabilities in our cloud stack recently and our virtualization things and the idea that you can now put that same software into these boxes and basically run it, you know, orders of magnitude faster. Uh, it's all sort of a positive sort of, uh, way, you know, let's design a computer runs code differently, can still run the existing code. 
 

The cost of change and the cost of adoption is minimal. You know, we've shown in this that there are actually more lines of code running on this platform than there have ever been lines of open source code in Rust, for example, and obviously the rate of authorship of legacy-language, non-safe code is faster than the rate that they're rewriting it 
 

in new languages. So this is a divergent problem, not a convergent problem. So what can we do about it? And obviously my talk will be covering what's happening, what needs to happen in the future, and what people need to know about it to be able to start, you know, pulling [00:16:00] this through the supply chain and creating roadmaps in which the businesses themselves can start benefiting from the reduction of the threat against them. 
 

It's a de-risking strategy for cyber security, you know: let's fix some of those root problems, not just band-aid it with patches, you know. 
 

Sean Martin: Or policy. And audits that take up a bunch of time as well. 
 

John Goodacre: Yeah. Well, clearly, if we can have some language around it... In some senses, I think it's almost becoming socially unacceptable, isn't it, 
 

to be shipping products that can still be attacked in ways that we know how to attack them. And, you know, even someone at school learning about memory, you know, buffer overflows, it doesn't take them long to realize that they can just send a few extra characters and start executing random code, that kind of thing. 
 

Yeah. We've just got to get rid of that as soon as we can. It's now really a business adoption and commercial and policy kind of discussion [00:17:00] on how we protect ourselves. And it's a global problem. How do all the governments get together and basically say, hey, suppliers, start doing this by design and by default? 
 

Otherwise the cyber security of billions of people, and the businesses, and the benefits of the digital economy are all under threat, basically. And, you know, I think we went from, was it 18,000 reported vulnerabilities a year a bit ago, to 30,000 this year. You know, people are doing a lot to reduce it, but look at the amount of digital code that's being written. 
 

Oh yeah. And with the ferocity of people attacking systems, the number of vulnerabilities is going up exponentially if you look at the charts. 
 

Sean Martin: Yeah, I know I don't necessarily want to head down this path, but with AI writing code, you're just going to amplify all those bugs, right?  
 

John Goodacre: Well, you know, I think there's a few things around AI. Obviously, you know, AI can attack, and AI can defend and tell you [00:18:00] what's going on a lot faster than a lot of SOCs can run around and see it. 
 

But then the AI is also an area that could be under attack itself. So if you've got an AI that's doing some recognition, or spotting something wrong, it's running on this huge stack of legacy code, operating systems and, you know, interpreters. It's just a data structure in memory that can be tweaked by a buffer overflow in the decoder of the network stack. 
 

You know, it's just as vulnerable as everybody else. It's not the answer to everything. It's clearly going to escalate the rate at which vulnerabilities can be exploited or seen, but it itself is going to be vulnerable. But also, you know, if you start looking at the inference models: I own the model, and that's my value, 
 

and I want to sell it to somebody so they can spot smiley cats. You know, how do you protect your IP? [00:19:00] Because clearly you're under a threat that somebody can take your model and your business is gone. So this is where the containerization can help for AI as well. You know, separate the model from the interpreter, from the rest of the system, and you're protecting your IP as well. 
 

So, you know, there's lots of stories. And it's just a case of prioritizing which ones are the key ones that are going to let it be ubiquitous across the whole of this sector, across all the sectors. 
 

Sean Martin: Yeah. And I think one of the things that you shared prior to recording was this mindset that we have to establish, and that starts with some awareness, with some training and education. 
 

Uh, can you touch on that a bit for me?  
 

John Goodacre: Yeah, so, you know, I said earlier that these problems have been in the computer architectures for 50 years. Okay, obviously not all of that time were they under threat and under attack. But [00:20:00] in essence, how many people are as old as me and can remember that there were actually different types of computers back in the 60s and early 70s? 
 

Yeah, we had stack-based machines, Lisp machines, that didn't run with this idea that a single number can read all your memory. But by the mid 70s, early 80s, that was the ubiquitous model: you have an integer number and you can read all your memory. Now, obviously, there have been things stuck on the side since that try to constrain how much it can read. 
 

But a lot of those partitions are very expensive to implement. So how we take that forwards is, you know, up for discussion. But please, I don't think I fully answered your question there, so go back and ask me again.  
 

Sean Martin: Well, I'm just wondering. So new stuff, I think, we need to create with secure by design, secure by default in mind. 
 

Right? So yes, we have the legacy. We need to [00:21:00] come to a certain point where we're protecting a lot of that and then new stuff moving forward. Um, so how do we get to a point where we can scale secure by design, secure by default? 
 

John Goodacre: Yes. So obviously the by-design part, for me, just geeking out for a moment, 
 

is all about abstractions and the leakage of data across abstractions. Okay, so the whole of our digital economy is built on abstractions, whether it's the way a transistor works in analog and electricity, so you can side-channel attack through heat, or whether it's the hardware-software division, or the operating system and application boundary, or the driver; it's all the abstractions that are leaking. 
 

So, you know, in essence, it's bringing that into the education of the people who are working either to provide the abstraction, and that's where the by-design bit comes in, or the people that are using abstractions, and that's where the by-default bit comes in. So if I'm a designer of a product that's [00:22:00] integrating and shipping a whole load of abstractions and a whole load of functionality, I'd better understand that, by default, I need to design this to be more secure. 
 

And if I'm shipping that abstraction, I'm shipping the transistor, the hardware chip, the network stack, the protocol interface, I'd better not do things that aren't working by design. So, you know, obviously I've spoken about this memory safety by design, but there are other ones. Where do I hold the magic key, the keys of the system? 
 

Okay. Have I designed it in a way that it just sits in that flat memory space that allows people to read it? Well, for 80 or 90 percent of the chips, that's true today, but there's a growing number of them that are putting in roots of trust, and they're putting in protected stores and things like that. So it's, you know, an educational thing, that the people using it, those developers at the top, actually know to ask, well, does what I'm buying have by-design capabilities in it 
 

that make my life easier and protect me against any mistakes I might make at my level of the supply chain? Obviously the big recent news on the supply chain was that one that was going into the SSL stack for Linux recently. And again, you know, you could say, well, I want to use a library, and I might not know its full provenance. 
 

I'd better be able to put it in a box, protect myself by design, so that with those libraries I'm using there's no way they can get out and start doing things that they weren't expected to do. So, you know, we're starting to see that in the UK here. We've got something called CyBOK, the Cyber Security Body of Knowledge; it's a long list of educational areas of interest. 
 

And we're feeding into that, and that becomes basically what academia, the teaching community, then spreads out through their teaching. So, you know, hopefully we'll get this concept through, but I think just getting the language right and starting to see examples of this coming through the supply chain will [00:24:00] hopefully pique the interest of, you know, the component developers, the hardware developers, the system software people sitting between that and the application spaces. 
 

And then at the top, we're seeing it with governments, as you say, putting out: go and do this stuff. Watch your roadmap to memory safety, guys. Watch your roadmap to by-default. You know, in the UK here we've got our PSTI bill: you'd better make sure you know how to upgrade and support your devices, and not put default passwords on them. 
 

So, you know, I think the language is getting there. I think there's a global alignment coming in that. Yes, there's a focus on what I can do today by writing in memory safe languages, but, you know, like I said, there's an unsafe bit in that as well. Just because I wrote it in Rust doesn't mean it's memory safe. 
 

You know, it's an interesting one: just because I've run it on a memory safe chip doesn't mean it's memory safe either. Well, some of the microcontroller stuff we're doing, it is, but, you know, it's an educational step. And obviously the reason [00:25:00] I'm offering to share my talk next week is to make sure that more people know about this hierarchy, these areas of responsibility, rather than just saying cyber security: 
 

have you patched? Do you know what your SOC is? Do you know what your risk mitigation is? Do you know what your recovery mechanism is? Yes, it's very important, you need to know all those, because you've got attacks that aren't at the technology level. But, you know, if we can do something about the technology level and the threats it's facing, then you can start worrying about the real ones, which is, you know, am I running the right functionality at the right time for the right people? 
 

Sean Martin: Yeah. Then we're going to focus on the, uh, the logic part of it.  
 

John Goodacre: Yeah. It was funny. There was a big workshop that was looking at the growth and the skills requirement for cyber security. They were saying we need more people to understand how to protect their systems and how to run monitoring. 
 

I was going, look, let me paint that as a slightly different picture for you. Ultimate success: 90 percent of your population in employment is running around patching your systems, knowing how to run it. Is that the economic growth and prosperity you want? And they went, oh, yeah, no. Okay. So what you really want is to make their lives easier so they can focus, 
 

you know, sustainably, on what's required, which is offering great service and doing it in a resilient and high-integrity way.  
 

Sean Martin: Yeah. It's funny. I've had this conversation a few times and, I don't know, I'm a nerd this way, it rolls around in my head almost 24/7, this belief that 
 

security has knowledge and data and experience that could be applied at a business level. I mean, we spend tons of money on marketing, right, to fine-tune the qualified-lead-to-deal funnel, right? What if we did that math looking at how we tune [00:27:00] business operations, business processes, so that they stay up all the time and they can't be compromised? 
 

John Goodacre: Security is a really interesting one for understanding what the economic ROI, the return on investment, for security is. Because, you know, there's one of our demonstrators that's basically converted the entire stack, right from the boot up to the graphical browser, and it's in something called CheriBSD. 
 

It's a BSD, Linux-like environment. And they tried to demo it, and someone said, yeah, it's a computer, what's the value of that above that one? And it's actually very hard to sort of say, well, it's not as vulnerable. And so, you know, I'm finding lots of businesses and sectors that go, yeah, my mitigation is in place. 
 

If I get attacked, I know how to recover, so tell me again why I need to change what I'm doing. And, you know, how do we get them to realize that it's not the attack you've not had, it's the one you're about to have, and it's that threat and that cost that's going to be the real struggle. I think I saw a number from, was it McKinsey recently, that it was around 10 trillion lost in IP and in attack and maintenance out of the global GDP. 
 

Okay, 10 trillion. That's a freaking lot. If we can halve that, that wouldn't be bad, would it? Yeah. That would put a few pennies in everybody's pocket. 
 

Sean Martin: A few bob in my pocket for sure. Ah, well, John, um, fascinating. I feel like we can continue to dig deeper and maybe, I don't know, maybe after, uh, InfoSecurity Europe, we can do that, have another chat. 
 

I definitely am excited for your talk, Progress for the DSbD Initiative, which is what we've been talking about here today, and the CHERI capability hardware, which is what you just referenced. That's on the keynote [00:29:00] stage, Tuesday the 4th, 12:30, half 12 there, local time. And, yeah, I'm excited to hear what you have to say. 
 

John Goodacre: Yeah, with pictures rather than just my hands waving. 
 

Sean Martin: Ooh, look at that. Not just hand waving. Italian style. 
 

John Goodacre: Hopefully we'll catch a few people on this, and I'll be around all day, so you can't miss me. Bright red head, and yeah, I've got a red shirt. There you go. Red head, red shirt. Yeah, come and say, hey John, 
 

I heard your talk, I'd really like to know more about this, and I'll say, yeah, yeah, come and see me, and we'll change the world. That was the first thing I did when I started the program: I stood up in front of what was a fairly small ecosystem at the time, about 20 people, and said, we're here to change the world. And now we've got about four or five hundred in the ecosystem, so yeah, we'll reach everybody at some point, with awareness. 
 

Sean Martin: Right on. 
 

Well, John, it's been a pleasure chatting with you and, uh, congratulations on getting this spot to raise awareness for this at the conference [00:30:00] and, uh, looking forward to the, to the presentation and the conversations that, uh, come with it after. And, uh, you're very welcome back anytime. And for everybody listening, please do follow. 
 

We have lots of chats on the road to Infosecurity Europe in London. Marco and I will be there on site to catch up with folks as well. Lots of stories, lots of cool stuff going on, including what John's talking about, Tuesday the 4th, half 12 there. So, John, thank you. Thanks, everybody. And thank you very much. 
 

Goodbye.