Welcome to Redefining Society And Technology, the podcast where we take a close look at how technology and society intersect, collide, and ultimately, shape one another in ways we sometimes overlook. In today’s episode, we’re diving into a conversation that cuts right to the heart of these connections.
Guests:
Fred Heiding, Computer Security Specialist, World Economic Forum [@wef]
On LinkedIn | https://www.linkedin.com/in/fheiding/
On Twitter | https://twitter.com/fredheiding
On Mastodon | https://mastodon.social/@fredheiding
On Instagram | https://www.instagram.com/fheiding/
Sean Martin, Co-Founder at ITSPmagazine [@ITSPmagazine] and Host of Redefining CyberSecurity Podcast [@RedefiningCyber]
On ITSPmagazine | https://www.itspmagazine.com/sean-martin
_____________________________
Host: Marco Ciappelli, Co-Founder at ITSPmagazine [@ITSPmagazine] and Host of Redefining Society Podcast & Audio Signals Podcast
On ITSPmagazine | https://www.itspmagazine.com/itspmagazine-podcast-radio-hosts/marco-ciappelli
_____________________________
This Episode’s Sponsors
BlackCloak 👉 https://itspm.ag/itspbcweb
_____________________________
Episode Introduction
Hello everyone, Marco here. Thanks for joining another thought-provoking episode of the Redefining Society and Technology Podcast. Today, we’re exploring the ripple effect between society and technology—a back-and-forth that shapes our everyday lives more than most people realize. Joining me for this conversation are two thoughtful guests: Sean Martin and Fred Heiding, each bringing a unique view from the trenches of cybersecurity and beyond.
Technology and Society: A Two-Way Street
The conversation kicks off in a bit of a role reversal—this time with Sean on the other side as a guest, and Fred, who’s no stranger to the podcast, rejoining us to share perspectives shaped by years in both academia and industry. Together, we dig into a foundational question: How does technology shape society, and how does society shape technology? It’s a question that lies at the core of our podcast, and Fred brought up some great examples of how national cybersecurity policies reflect deeper societal and cultural values across the globe.
The Influence of Capitalism on Tech Advancement
As we peeled back the layers, the conversation naturally turned to capitalism's role in technology’s relentless drive forward. There’s no doubt that economic incentives can fuel remarkable innovation, but they also raise critical questions. Are we designing and creating technology solely for profit? Is there room for ethical and societal concerns to play a larger role? This tension is particularly evident in areas like artificial intelligence, where financial motives might sometimes overshadow the broader social benefits we’re hoping to achieve.
AI in Cybersecurity: A Double-Edged Sword
Speaking of AI, the conversation wouldn’t be complete without discussing how it’s reshaping cybersecurity. Fred laid out a compelling look at the dual role AI plays in this space—enhancing our defense strategies, yet potentially giving new tools to attackers exploiting human vulnerabilities. While it might sound daunting, I left the conversation feeling optimistic. AI could become an accessible, universal shield, offering protections that adapt to everyone’s needs, tech-savvy or not.
Looking Toward the Future
We wrapped up on a hopeful note, looking ahead to a future where cybersecurity goes beyond merely responding to threats. Instead, we envision a world where technology anticipates challenges, creating tools and strategies for the betterment of society. After all, if we can use technology to crack decades-old cold cases, who’s to say we can’t use it to secure a better future?
As always, my goal is to leave you thinking—questioning the status quo, exploring the labels and promises tech offers, and considering how it all lines up with our values and goals.
Subscribe and Stay Curious
Huge thanks to Sean and Fred for joining me today. And to everyone tuning in, make sure you subscribe to Redefining Society And Technology Podcast and our ITSPmagazine YouTube channel. There’s plenty more ahead as we continue challenging ideas, redefining technology’s role, and asking the questions that need answers.
Until next time, keep questioning everything and stay curious!
_____________________________
Resources
_____________________________
To see and hear more Redefining Society stories on ITSPmagazine, visit:
https://www.itspmagazine.com/redefining-society-podcast
Watch the webcast version on-demand on YouTube: https://www.youtube.com/playlist?list=PLnYu0psdcllTUoWMGGQHlGVZA575VtGr9
Are you interested in sponsoring an ITSPmagazine Channel?
👉 https://www.itspmagazine.com/sponsor-the-itspmagazine-podcast-network
Cybersecurity as a Proactive Societal Safeguard? How Cybersecurity Shapes Society and Technology — and Vice Versa — and The Other Way Around! | A Conversation with Sean Martin and Fred Heiding | Redefining Society Podcast With Marco Ciappelli
Please note that this transcript was created using AI technology and may contain inaccuracies or deviations from the original audio file. The transcript is provided for informational purposes only and should not be relied upon as a substitute for the original recording, as errors may exist. At this time, we provide it “as it is,” and we hope it can be helpful for our audience.
_________________________________________
[00:00:00] Marco Ciappelli: Hello, everybody. Welcome to another episode of the Redefining Society podcast. If you're watching, you may be a little bit thrown off by this, because you see Sean is here and today he's in the role of a guest, because it's my show and sometimes I go on his show. So it's kind of fun. A little less pressure, I guess, on your side.
[00:00:24] Sean Martin: Yeah, I'm just here. I'm freeloading on the session today. I love chatting with Fred, and you happen to be the host of this one.
[00:00:33] Marco Ciappelli: I know, I know, I know. I was going to say about Fred that you're, at this point, a returning guest for, I guess, the third or fourth time, maybe, if you count RSA last year.
Yeah, yeah. So, a familiar face. I love, honestly, to have returning guests, because you already know the style, you already know what they're good at. And, uh, even if this time we're not talking about finance and cybersecurity and we're not at the RSA Conference, we are where we talk about how technology impacts society.
And hopefully how society impacts technology, which is what I really like. And, uh, of course you cannot talk about technology without cybersecurity, and you are an expert in that field. So, uh, we all know that Sean is crazy about cybersecurity. He loves cybersecurity; he eats it, breathes it, and I don't know what else.
So anyway, this is going to be a lot of fun and I think it's going to create some questions in people's minds, but also hopefully give some, uh, some answers and some good tips. So let's start with, uh, Fred, uh, introduce yourself. Uh, I know you do so many things that, um, I'm not even going to attempt to, uh, you can do a better job than me.
[00:01:55] Fred Heiding: After that intro, thanks again for having me. It's always an honor and a pleasure to come back here, because the sessions are so interesting, and I think I always learn a lot from talking to you guys and having your questions pop up. And I really like the name of this, uh, this podcast. Actually, this week I had a friend telling me, he really urged me to leave academia and start up a business.
And I'm not going to do that immediately, but he said, Fred, like, what's the purpose of all this research if you're not sure that it, you know, redefines society? He literally said those words, right? Like, you have to do something, you have to get your research out to the people. And I kind of believe that to some degree that happens, but of course, there's always a difference between academia and research and industry, and we have to think about how what we do reshapes society and redefines society. I love that name. And I think I'm pretty excited about where this conversation will lead. I actually don't know; I think we're going to learn something. But yeah, I'm Fred, and I'm just wrapping up my PhD in computer science.
After that, I'm just about to start a postdoc at the Harvard Kennedy School, which is the policy school over at Harvard. So that's quite exciting. I've been doing computer science for almost all my life, although the last couple of years I've been working a lot with phishing, which involves some psychology.
I've been doing some business-oriented stuff and also been trying to investigate what the actual cost of phishing attacks is and how much organizations should spend. That's a very big gray zone right now. We don't know; in the bigger, general sense, we don't know what cybersecurity is truly worth either.
I think that's very interesting. What is the real cost of a cyber attack? And this last year I've worked a lot with strategy, a lot with policy, seeing how you take these different technological cybersecurity solutions and use them in a way that affects society and helps society. And what should technology be used for?
And there's this trade-off, right, when you start talking to people about security versus operability and usability: how much are people willing to give up in terms of convenience, you know, for having a secure product? Because it's just not convenient to have a secure product. And these things more and more start to play into national security.
I think that's quite interesting. And yeah, I love these questions. I'm very interested in the turn that my research is taking now. So I think that I'll move a little bit closer towards the redefining society way, if you still want to put that label on my direction. And I'm excited about that.
So that's where I am and I'm really excited to be here today.
[00:04:16] Marco Ciappelli: Yeah. And we're going to dig into, again, the motivation, how you felt attracted to that as a more technical person. Sean, I know that you're interested in society because I made you interested in society, as you made me interested in...
[00:04:32] Sean Martin: I dragged you into cyber, and you, you made me think about, well, what...
[00:04:37] Marco Ciappelli: What do you expect from this conversation, Sean?
[00:04:41] Sean Martin: I expect to change the world. There you go. No, in all seriousness, I think just the idea, I mean, when we founded the magazine, ITSPmagazine, we brought together the technology and the cybersecurity and how that impacts how we live and work, and vice versa.
So for today, I mean, what I look at is business operations primarily, right? How do businesses run with security and privacy in mind? Not in a way that hinders growth and revenue, but in a way that generates money and trust in their user base and customer base.
So it's that latter part, right? The users and the customers, um, that I'll bring to this as well. Maybe not just communities, but users of technologies who happen to be communities, but also members of, or employees of, companies, partners of companies, things like that. So I was excited to come chat with Fred, as I said.
[00:05:47] Marco Ciappelli: You know, it all comes together. I think this is the perfect setup for this conversation, right? You can't just take something and put it in silos, which is terminology we use a lot in cybersecurity. We realize that everything is synergistic. There is synergy between what we do, what we invest, what we develop, the way we grow, and you can't just do it for the money.
Although, unfortunately, money is also what funds research. So we've got to find the balance there. Um, we need another kind of society, and we can go into what the goal of that is, but, you know, a society that actually does care, I think. And the question that your friend asked you, Fred: why do we do all this?
Right? So why don't we start there? Why was that such an important question for you, and how do you relate to this redefining society concept?
[00:06:48] Fred Heiding: Yeah, that's a great, a great question. One thing I really like that you guys said just a couple of minutes ago is that it's not just about how technology changes society, right?
It's about how society changes technology. And a while ago we discussed a little bit on this podcast one of the studies I did where we compared a lot of different countries' national cybersecurity strategies, which is quite interesting. Most modern countries have an official strategy, which is a document that lays out their sort of political ambitions for working with cybersecurity.
And I've been working a lot with these strategies, and with the policymakers who create them, during the past years. One thing that we learned there, and I was quite curious to find this out: what are these documents? One thing you learn is that they are, in every sense of the word, sort of a product of society shaping technology. If you look at the American cybersecurity strategy or the Japanese cybersecurity strategy,
they're different, not just on technical levels; you can see the political differences in the countries by analyzing these strategies. And I think that's quite cool, because of what that means in practice. These documents are actually used, right? They really affect the, um, the cyber climate of the country.
The political culture, the society, is affecting the way we work with cybersecurity, which is very interesting. I briefly mentioned this before, but, for example, we can see that Japan's entire cybersecurity strategy, the motivation for them to even create it, to some degree, was to protect vulnerable people.
Like, they really wanted to make sure everyone is protected; they have the term cybersecurity for all. And that's what cybersecurity is for them. To some degree, this is a big generalization, but you can say that. And if you look, for example, at the US, we have an enormous protection of critical infrastructure and of large businesses, and almost no mention of elderly and vulnerable populations, and that doesn't come from technical limitations, right?
That comes just from sort of political strategies, and from something other than technology affecting cybersecurity. I think this is incredibly interesting, because when you look into these things, as you said before too, the backbone of cybersecurity research, of course, is money and grants. Those grants often come from governmentally tied organizations.
And I think these sort of high-level political thoughts and strategies and philosophies of how the country should be run, they take a lot of different shapes, and one of those shapes is cybersecurity. And by analyzing these high-level strategies, you can also see that, you know, maybe the reason that there are so many successful online scams that hurt a lot of people, in the US for example but also around the world, is that these vulnerable people don't receive the same rigorous support in terms of cyber as, for example, critical infrastructure.
And of course, to some degree, that makes sense, right? We need to protect our power plants and so forth. But it would be good, I really believe, if there was a better way, a clearer way for everyone, especially citizens who are not so tech savvy, to learn about scams, to know what to do if someone hacks them.
Everyone knows who to call if there's a fire in the house, but not everyone knows who to call if someone hacks them. And I think that these questions really interest me. How can you sort of convert this? We have a lot of really good technical solutions, but how do we spread them to ensure that everyone can be part of them?
Uh, I think that's, uh,
[00:10:13] Sean Martin: I tend to look at it, of course, as every country has a different culture, which drives what, uh, what they care about most. And in the Japanese example, the society does take the top spot in what they care about, right? Having respect for each other and taking care of each other. I look at it from a business perspective.
Maybe that, and I don't know, maybe the US approaches things from a business perspective: how can we grow the GDP? How can we keep, uh, the deficit down? How can we help companies and individuals grow and succeed in their careers and in life? Um, we really push, I mean, we have patent offices and all these things that really push for innovation.
Uh, we have a market that drives and supports first-to-market, and innovations that may push the limits on what's safe. And so I'm just wondering how, I don't know if it's the market or the culture, that changes some of this stuff. Because when we look at a business, oftentimes the culture and the values of the business drive how they build products, how they deliver services, how they treat their employees, how they interact with their customers.
Do they take a privacy-first approach, or do they just collect everything? It's a tough balance, I think, and I guess, how do we list it all? How do we prioritize it all? Do some countries not list it all? They just focus on what they think is right and leave other stuff that they haven't thought about off the table.
Yeah. So, those are my thoughts.
[00:11:59] Marco Ciappelli: The answer to that question is We don't know.
[00:12:02] Sean Martin: I know.
[00:12:07] Marco Ciappelli: Here's what is interesting. I'm reading a book about, you know, the history of humanity. Really cool book; maybe I'll put it in the notes. And when you get to the part about capitalism driving Western Europe in the 1700s and onward, it grew from the idea of a capitalistic society, right?
You know, the growth is the objective. So you reinvest the money, you know, Adam Smith, The Wealth of Nations, and you create wealth because you do that, not just because you're greedy, but because you're making society better. Like, if everybody's richer, everybody can enjoy life more.
If there is more research, people can live longer, and all of that. Now, some people, I think, do it with the right idea in mind, often the researchers, the scientists, but the money comes from people who want to make money. So there is a lot to think about right there, about what drives innovation, I think.
But I think we're at the point where the change in society is so big, in terms of technology, with artificial intelligence and all the advanced technology that we have now, that we need to re-establish how we do things again. I don't know, Fred, am I mumbling, or do you see something here?
[00:13:41] Fred Heiding: No, I think this is super interesting. And it's also, I think it's difficult to have this conversation without sort of digressing and pivoting away a little bit, but I recently had a lot of conversations with a bunch of folks from the AI community about artificial intelligence and capitalism. I think that that's quite interesting, right?
Because when you start, and it's relatively tied to computer security, in the sense that everything sort of relates to these topics in the context of scaling up technology and security, the role of capitalism is very interesting in this. Obviously, for my sake, I moved from Sweden to the United States and I'm quite a big believer in capitalism, because otherwise I wouldn't have moved here.
Um, but when you start looking at the sort of underbelly of these things, you see some trends. For example, with labor replacement, which is a big, um, big theme in the AI communities these days: what happens when, you know, we replace a lot of human jobs? And it's interesting again, and this comes back to, you know, how society shapes technology, because when you create, for example, AI models, you create them to a large degree based on some sort of a capitalistic undertone, right?
And that's how they're used: they're used to maximize profit. And whatever you say, you know, there are a lot of stories from all these companies, but you create products to maximize profit. You almost always do that, right, in a capitalistic society. And is that ideal when you talk about superintelligence?
And that's a little bit of a leap away from computer security, but I think it's quite related, because a large part of my research is also about controlling these models, and that's a big part of all these national cyber strategies too, of course; a large part of them these days is about emerging technology.
How do we ensure that we're secure against tomorrow's models and other schemes? I think that's quite interesting. It gets a little bit vague when you talk about these things, but it's also very interesting to see, because again, you talk about developing superintelligence in the larger case, and then you, you know, you task security people with protecting us against that superintelligence and, in the larger case, protecting society against that.
And I think that's very interesting, because in a way you create something with the sole purpose, well, not the sole purpose, but with a strong purpose, of profit maximization that will very obviously benefit a few owners incredibly much. And how do you secure that and ensure that society benefits and is sort of shaped in a good way?
And it's almost like society shapes the way we create these technologies, and then these technologies shape society. It's a self-fulfilling sort of prophecy, or loop. Whereas if you had another context, I mean, you can imagine a thought experiment with another type of society, with different values, different principles.
And maybe in that society, technology would be created in a bit of a different way. Models would be trained in a different way, used in a different way, created for different purposes. That would create another loop. I think these things are quite interesting because we live in an exponential curve, right?
It goes very, very quickly. When you live in an exponential curve, there's a lot of change and decisions you take have a lot of consequences a couple of decades or even years down the road. And that's an incredibly vague answer to your question. It was already a little bit vague, but I think that's, that's a good answer.
That that's, yeah, that's my 2 cents on this.
[00:17:06] Sean Martin: Yeah. I don't know. You've said a couple of times now that society's creating the technology, and maybe in some countries that is the case. Um, but I really feel that there are one or two big companies, one or two individuals perhaps, even, that have strong views on what they want to accomplish, right?
And we can look at Amazon and the SpaceX group of companies, and the things they're investing in are very broad and wide, covering large swaths of how we live our daily lives. And underneath that, so I think with companies and the people that run those companies, what's being built is for the sole purpose of making them money, while thinking that they are hopefully doing something good for society.
Then underneath that is the innovation of the technology that pushes things further, maybe even beyond the views of the people at the top, be it an individual or an executive team. And I think the technology is pushing things even further. And in both of those cases, I don't see individuals having a say in it.
I think we're tossed new products and features that we're even forced to use if we want to gain a benefit from technology, products and features that we don't really have a say in, or at least we don't think we have a say. So I guess my position is I don't think society's defining technology. I think technology is forcing society to adhere to what a small number of folks want.
[00:19:00] Fred Heiding: So I really, I really agree with that. And let me redefine what I said, because I think we're actually saying the same thing. What I'm saying, if you want to rephrase it a little bit better, is that a part of society defines technology. Because, again, these, let's say, large companies, they are society, but they're not the whole of society.
They're a small part of society. And it is actually a very important distinction that you make, right? That a part of society, a small part, a very small part, is creating technology for all of society. And that's quite interesting, and it's very important to make that clear, as you do.
Yeah, a minimal part of society, actually, creates technology for the big part. And as you mentioned, that can be good, right? In a lot of ways, as you, Marco, mentioned before, it's created a lot of benefits in the last two or three hundred years. There's a whole lot of things that are a lot better in society. But, just to play with these thoughts again, if you look at terms such as labor displacement:
I mean, capitalism is fantastic as long as people are useful, right? But if people are not useful, and a lot of good folks believe that people will not be useful for the concept of profit generation for much longer, and if they're not, then that's quite interesting, right? Because what happens with the world then, and what happens with security then?
You know, there are far bigger topics than cybersecurity in this context, and I think, um, yeah, I think those are interesting questions.
[00:20:27] Marco Ciappelli: Wow, I'm really loving this conversation. I feel like I'm home. No, but truthfully, I feel that when you presented this alternative view, between society driving versus a few driving, the point is that I think you're both right.
Both are in play, because again, it's all a synergy of things, and one cannot exist without the other. Now, a lot of people say that the consumers, as a society, make the decision, because if they don't buy the product, then that product is not valuable anymore, and the company that makes it, maybe it's going to go out of business.
But I think that when a company reaches a certain size, where you don't even know where they're investing anymore, I don't think that the failure of one product is going to make a difference, or mean the failure of the company. So a little bit of economic philosophy can go into this conversation, maybe.
But I spoke to someone one time whose opinion was that the solution for better technology is to actually have smaller companies that are innovating, instead of a few large companies that do the innovating. Now, maybe somebody is going to call me a socialist right now, or a Marxist or whatever, even if they don't know what it means, but the point is, it doesn't mean that we need to let these companies become these incredible monsters that, in the end, will make the decisions for us.
I feel like sometimes the, the everyday person doesn't have a choice. And that's a problem.
[00:22:23] Fred Heiding: Yeah, that's interesting, right? And we actually analyzed that quite a bit lately, in terms of incentive structures, the different ways for nations to work with incentive alignment in computer security.
One way, as you mentioned, is to try to incentivize consumers to buy secure products. There's usually a premium. You need to pay a good chunk of extra money to have a secure IP camera instead of a normal IP camera. And there are unfortunately a lot of cases where you buy a secure product, but you're still able to hack that product.
So it's not always, I guess, that you know what you're paying for, and I think we haven't really cracked that yet. There are a lot of certification schemes where you have this new cyber-secure product certification, and I think that's good. I kind of like the food metaphor. You know, if you go and buy something at the store, there are all these labels and certifications for food; there's some amount of checking to make sure that the food is healthy, way more so in Europe than the US, but even here in the US we have that.
And there's a good parallel there to security, right? If we know that there's a label on this product, that it's secure, that it's gone through all these tests, then we're probably going to be a bit more likely to buy it. But certifications, especially in the context of computer security, are no panacea or, you know, solve-all solution either.
It can be the case, and there's a lot of criticism saying exactly this, that if you have a certification, then companies will just optimize to pass the certification rather than creating a secure product, right? They do the minimum effort in that case, and that might still be good, because at least they need to do something. But there are other ways too. As you said, in a perfect world, if every user cared one hundred percent about cybersecurity, they would always buy secure products, and if it's an insecure product, they're not going to buy it. That would be good, and then companies would be incentivized to create really secure products. But that's almost like saying that if everyone bought healthy food, no one would sell unhealthy food. That's just not how it works, because there's a common trend of just taking the shortest path and saving a few dollars, you know, to get a cheaper product.
[00:24:36] Marco Ciappelli: Or there could be the people who just cannot afford the more expensive product. So you either bring down the price and everybody gets the good food, or, in this case, everybody gets the secure product. That's a thing.
[00:24:50] Sean Martin: Well, so, toward the beginning, and I know it was one of the points we wanted to touch on, we talked about, um, I don't want to say underrepresented, but vulnerable folks, right? And I know there are, for people who can't see well, or are blind, or even just colorblind, people who can't hear well, or at all, there are laws, uh, certainly at some state level, perhaps some federal level.
I'm not too, too versed in it all, but I know that there are things that we do to say you can't just hit 90 percent of the market and ignore folks with disabilities because you don't want to serve them. I don't know that security falls directly in the same category, but I'm just wondering if there's something to be learned from this, where we can say we're all vulnerable, right?
We all have this flaw of falling for scams. Why are we not doing something to help folks? And maybe there's a scale, for all the elderly folks who aren't as tech savvy, or I don't know, but I guess my point is we seem to have some answers for some things. Um, the example I'm talking about now is government driven.
I don't necessarily know that we need to do that, but, um, uh, yeah, I think we seem to find solutions to these problems, but this one, we just keep talking about it a lot and can't seem to unlock the code on it.
[00:26:22] Fred Heiding: Yeah, no, I think that's very true. And I think, unfortunately, to some degree, what you say is just due to the way the market is shaped.
I don't think the market has to be shaped this way, but I've talked to a lot of companies who have a lot of, you know, these various scam protection and security solutions, and a lot of them are solely focused on B2B, business-to-business, sales, you know, because of course the big money is in making a big sale to the companies. And that makes sense,
because they spend a lot of time and energy developing these various solutions to, uh, perhaps train people to be more secure, or to build more sophisticated scam detection systems. And I think it would be good if there was bigger support for this, and hopefully, eventually, the market will adjust itself, right?
The same way that most systems have had firewalls and so forth for a very long time, because private people needed that as well, and at some point they implicitly demanded it, so it became required. But yeah, we don't have that with scams. We don't have that with cyber insurance for private people.
And a lot of this is just due to the fact that there's a shortage of information, and also a shortage of really good, decentralized, local law enforcement resources and competence for dealing with these sort of low-level cyber issues. And I think, for example, in the U.S., there's a pretty good response now if you're a big company and you get hit by a ransomware attack; you can usually get pretty good help from the FBI or so forth. But if you're a private person who gets hit by a pretty good scam and you fall for it and you lose a good chunk of your savings or other things, then it's less clear how you will get assistance and how people will try to find the people who did that to you.
Of course, we're working on that, but I think it's, um, it's just not there yet, uh, in the sense of how protected people should be.
[00:28:22] Sean Martin: Yeah, and I'll add this quickly, Marco, because I also think about this: I've been doing this for 30 years, and I was around at the beginning of a lot of the creation of the cybersecurity market, where antivirus became a thing and then it exploded into all the stuff we have now.
Um, I guess the point is we've created a market to solve a problem that we didn't solve initially in the creation of the technology. And I guess we can look at a parallel in physical security, right? There are security patrols and community security services and things like that. They do the same thing.
So there are markets for security cameras and security patrols and all that kind of stuff. And so I guess it's a one-to-one, physical to digital. The question I'm thinking about is whether we have these mitigating systems in place because we either don't have the proper laws, or the enforcement of the laws, or the responsibility for those who break the law or allow the technology to be broken against the law.
[00:29:32] Fred Heiding: Yeah.
[00:29:33] Sean Martin: And so I don't know if there's something in the work that you've done from the national policy perspective that says some countries actually have better laws and better enforcement and better responsibility positions, and things that help kind of balance things out on that end of it, so that there's less of a need for the, you know, mitigating market, in that case.
[00:29:57] Fred Heiding: That's an incredibly good point. I think one thing, one problem here, I should say, and this sort of cuts across the board for a lot of nations, is that in cyber, attribution is very difficult, right? If someone hits you with a cybercrime, there are so many ways, both for big attacks and for small attacks, there are so many ways these criminals can, you know, first of all, just hide their traces, right?
It's less obvious who did the attack. And, you know, it's so common in these attacks: you find a group from one country, and then it turns out they kind of spoofed their attack methods. For example, you know, Russian hackers trying to look like North Korean hackers. And then you think you can attribute the attack to North Korea, but it turns out to be Russia, and so forth and so forth.
But it's really tricky. And as you said, I mean, obviously even with just normal crime, it's really hard to find the person who did a crime, right? But you can, and we have really good police officers who try to do this, and detectives and so forth. And we have cyber detectives too, uh, to use that language, but it's just way more difficult.
And yeah, there are different national approaches. For example, again, these sort of correlate to the political structure of the country, and Singapore is well known for having way more of an authoritative philosophy than, um, for example, many Western countries have. They have a way stronger surveillance culture, which is quite interesting, because, yeah, that surveillance culture makes it a bit more convenient to try to, um, stop criminals before they can perform cyber attacks.
And when cyber attacks have happened, they have a stronger chance to, uh, to catch the attackers. But even there, it's difficult, right? Because it's global. It's not the same as going and robbing your local supermarket, because then you have to be there in person, you know, and it's way easier to find you. But the fact that it's a global phenomenon makes it very, very difficult to, uh, to achieve attribution.
If you can't achieve attribution and penalize people, of course, people are not going to stop.
[00:31:57] Marco Ciappelli: So here's a thought as we go towards the end, and I kind of want to open the artificial intelligence can of worms, but I don't think we have the time for that, so maybe you come back and we talk about that. But I'm kind of going, weirdly, in a way that's unusual for me, you know, toward a positive view of the future, based on what you guys have said, because I tend to get dystopian and create this, you know, fantasy world where everything goes to shit.
Sorry for the French. Uh, but in this case you make me think, like, you know, we're still relatively in an early stage of the cybersecurity industry, and I know a lot of people say that, and even of the internet itself, and now AI. But I feel like I'm running a parallel in my head with other things that happened in the past, and it comes from you, Fred, bringing in security in the real world. I mean, we are reopening cases now, you know, cold cases from 20, 30 years ago, where you can still access the DNA of the person incriminated.
And then you find out that it's not really the person who did the crime, which sucks. But, you know, the technology is bringing us to a point where we can now resolve certain things that 30, 40, 50 years ago we couldn't. Jack the Ripper, we still don't know who he is. If it happened now, we probably would have found him, because of the cameras and all of that in London, right?
So my point is, I'm kind of optimistic that security, cybersecurity, will get better and maybe will reach and protect everybody, without expecting everybody to learn how to protect themselves. I don't know in how long, but I think AI is probably going to help with that. So, sure, a comment on AI, based on what I said.
[00:34:00] Fred Heiding: No, that's an incredibly good point. And actually, I'm happy to maybe link that in the end notes. I wrote an article in Harvard Business Review earlier this year about this very topic, about how AI benefits attackers and defenders. And it's quite context dependent, but you have a really good point. In terms of protecting systems, hardware, software, and so forth,
I do believe that AI will help defenders potentially way more than it helps attackers, for a few reasons. One of these reasons is that a lot of the cyber attacks we see now, especially big attacks, are very complex, but they almost always utilize the truly low-hanging fruit: there are perhaps some really poor security settings,
your devices aren't configured properly, and even these really big attacks that cost billions of dollars in damage are enabled by a few really poor security settings. And I think a big reason why these poor security settings and these poorly secured devices exist is that, again, there's a shortage of skilled labor in cybersecurity.
We don't have time to test all the devices and to perform as many security assessments as we would like to do. I do believe that, of course, attackers are going to use AI to launch a lot of cyber attacks, but defenders will also use it. And if defenders use it to raise the sort of lower bar of cybersecurity, I mean, that's going to benefit society a lot, and I'm quite optimistic about that.
The flip side is when it comes to scamming people and hacking, you know, humans. There I do believe that AI benefits attackers more than defenders, because AI makes it far cheaper for hackers to launch different scams, but we can't patch the human brain, right? It's not the same. With systems, we can just patch all our systems.
That's very, very good. With humans, the attackers are getting this fantastic new shiny attacking toolbox, but for humans, you can use AI to train people and you can use AI for spam filters, and that's great. But older AI models are already really good at spam filtering, while large language models, which are this other type of AI, only sort of incrementally enhance the spam filters from what we had before, yet almost exponentially enhance the capability of attackers.
So the short summary is that for protecting technical defense systems, AI is great; it benefits, um, defenders more. For protecting human users from scams and so forth, AI is not so good, and it benefits attackers more.
Well, I have loads of thoughts as a closing.
[00:36:31] Sean Martin: I recall a conversation we had with Charity Wright, and the concept that we'll see not just an Internet, but many Internets, and to me that says many networks. What I take that to mean is the ability to use technology in a way that's purposeful for the thing we're trying to accomplish, purposeful with the group we're trying to accomplish it with, and therefore it's still connected globally, but more isolated, uh, into different groups.
And so I think the technology is there to allow us to kind of protect ourselves. I know there are, like, identity technologies and all the Web3 stuff, and how we manage our data, who owns the data. Those types of things are coming, where I see we have a better chance to protect ourselves, connect ourselves to only the things we want, and bring attribution to the conversations and interactions we have with each other, be it with people or machines.
And so I see a future where the technology allows us to have a safer place that's still global, but probably more finite in terms of the groups that we're part of and connect with.
[00:37:52] Fred Heiding: Yeah. I think that's a beautiful way to put it and I like that.
[00:37:59] Marco Ciappelli: Well, I like the fact that we had this conversation.
I like the fact that I have more questions in my head than when we started, which is a good thing. As I always tell the audience, even now at the end, in the final bumper, the outro of, uh, of Redefining Society, that's exactly what I say: we need to question everything. We just cannot assume. Even to go back, Fred, to the label on the product: maybe that company made the minimum effort just to meet the label, and maybe it could have done something better.
So we are the ones who need to really read through and realize. It's kind of like, you know, we have to think, but sometimes we don't have the time, and so we rely on technology to stop, you know, the scam, or on the spam filter, and I'm sure they're going to get better. I'm sure AI is going to get into this game as well.
And, um, I don't know. I feel like, uh, the cybersecurity community is doing a great job. Yeah, it's definitely driven by money, but I think there are good people who are driven by wanting to do something for society as well. And it's always been like that. It's always been like that, and that's part of who we are.
So, Sean, thank you for stopping by. Thanks for letting me crash the party. Yeah, of course, it's mi casa, su casa. And it's ITSPmagazine. And Fred, always so great to have you, from computer science to political science. I love this: a Renaissance man, you know, thinking from many different perspectives, which is what we'd love to do for everybody else.
Stay tuned, subscribe to Redefining Society and to the YouTube channel for ITSPmagazine. And, uh, I'm sure there'll be many more conversations, probably also with Fred. Happy to have you back.
[00:40:06] Fred Heiding: I'm sure there will be. I would love to come back. Thanks for having me again. And, uh, it's been, it's been great as always.
[00:40:12] Marco Ciappelli: Very cool. See ya. Thank you.
[00:40:15] Fred Heiding: Take care.