ITSPmagazine Podcasts

AI & Cybersecurity: Legal Risks and Solutions | A Conversation with Justin Daniels | The Soulful CXO Podcast with Dr. Rebecca Wynn

Episode Summary

In this episode, we dive into expert insights with Justin Daniels on treating cybersecurity as a strategic business advantage. Tune in to learn how businesses can stay competitive and resilient in today's digital environment.

Episode Notes

Guest: Justin Daniels, WSJ & USA Today Best Selling Author, Shareholder/Corporate M&A and Tech Transactions Attorney, Baker Donelson [@Baker_Donelson]

LinkedIn: https://www.linkedin.com/in/justinsdaniels/

Host: Dr. Rebecca Wynn

On ITSPmagazine  👉  https://www.itspmagazine.com/itspmagazine-podcast-radio-hosts/rebecca-wynn

________________________________

This Episode’s Sponsors

Are you interested in sponsoring an ITSPmagazine Channel?
👉 https://www.itspmagazine.com/sponsor-the-itspmagazine-podcast-network

________________________________

Episode Description

In this episode of the Soulful CXO, host Dr. Rebecca Wynn sits down with Justin Daniels, a distinguished expert in corporate mergers and acquisitions, tech transactions, and cybersecurity, who also co-chairs the blockchain and digital assets practice at Baker Donelson. Together, they explore how businesses can gain a competitive edge by treating cybersecurity as a strategic risk rather than just a technical issue. Justin emphasizes the importance of robust disaster recovery plans and modern IT infrastructures to ensure resilience, discusses the implications of recent events like the Delta outage, and highlights the need for government-private sector collaboration to secure critical infrastructure. The conversation also covers executive accountability for cyber hygiene, effective vendor management, and the evolving landscape of cyber liability insurance and AI due diligence in M&A deals. Listen in to learn how these insights can help business leaders stay ahead in today's fast-paced digital environment.

________________________________

Resources

Data Reimagined: Building Trust One Byte at a Time: https://www.amazon.com/Data-Reimagined-Building-Trust-Byte-ebook/dp/B0BDVQ97YQ/

NIST CSF v2.0: Simplified Cybersecurity Guidance: https://www.linkedin.com/pulse/nist-csf-v20-simplified-cybersecurity-guidance-wynn-the-soulful-cxo-efvvc/

NIST AI Risk Management Framework: https://www.nist.gov/itl/ai-risk-management-framework

EU AI Act: https://artificialintelligenceact.eu/

Colorado AI Act: https://leg.colorado.gov/bills/sb24-205
________________________________

Support:

Buy Me a Coffee: https://www.buymeacoffee.com/soulfulcxo

________________________________

For more podcast stories from The Soulful CXO Podcast With Rebecca Wynn: https://www.itspmagazine.com/the-soulful-cxo-podcast

ITSPMagazine YouTube Channel:

📺 https://www.youtube.com/@itspmagazine

Be sure to share and subscribe!

Episode Transcription

AI & Cybersecurity: Legal Risks and Solutions | A Conversation with Justin Daniels | The Soulful CXO Podcast with Dr. Rebecca Wynn

Dr. Rebecca Wynn: [00:00:00] Welcome to the Soulful CXO. I'm your host, Dr. Rebecca Wynn. Before we jump in, please like, subscribe, and share the show. Today's episode is packed with insights that could give your business a competitive advantage or an edge. We'll discuss how to turn tech insights into real advantages, why cybersecurity should be seen as a strategic risk, and how to avoid costly mistakes in M&A deals.

Our special guest today is Justin Daniels, shareholder at Baker Donelson, where he specializes in corporate mergers and acquisitions and tech transactions. As co-chair of the firm's blockchain and digital assets practice, Justin helps businesses navigate cybersecurity risks throughout their lifecycle.

Recently, he was recognized by Super Lawyers for 2024, and his expertise is widely respected. Beyond his legal work, Justin is a frequent media expert on privacy and security, appearing on Scripps News and Fox 5 DC. [00:01:00] He's a sought-after speaker at conferences like RSA, co-host of the She Said Privacy, He Said Security podcast, and co-author of the Wall Street Journal and USA Today bestseller Data Reimagined: Building Trust One Byte at a Time. Justin, it's great seeing you again. Welcome to the show.

Justin Daniels: How are you today? 

Dr. Rebecca Wynn: I'm doing great. You know, a lot of people might not realize how much there is going on today in the world with technology and all these cases with CrowdStrike and SolarWinds and mergers and acquisitions and things along those lines.

And that's your specialty. So can you tell us a little bit about what it means to be a lawyer in that area, and how you even got started to have that be the focus of your practice?

Justin Daniels: So I started out as a corporate M&A lawyer, and I would say around 2010 I started to focus on tech, because where I live in Atlanta has become a real tier one tech market in the [00:02:00] last probably 10 years.

But when it comes to cybersecurity, the focal point for me was November of 2014, right after the Sony hack, when I hosted the Israeli version of Steve Jobs, who came to Atlanta for one day. And at the end of the day, he said, you know, you really ought to think about getting the word out. You really have an interesting cybersecurity community here in Atlanta.

And that's when I realized, oh, the cybersecurity thing, this is a pretty big deal. This probably should be something that I should focus on because it overlays every area of technology, and now as we fast forward ahead 10 years to 2024, as you pointed out, cybersecurity is a strategic business enterprise risk.

It overlays everything, and the CrowdStrike situation is just the latest data point that really drives home the point that we can't have a technology-enabled society without really thinking through how we're going to deal with the cybersecurity issues that come with deploying all these different [00:03:00] technologies.

Dr. Rebecca Wynn: You just recently wrote a post, and I know you've been on some other shows as well, about your piece on Delta and CrowdStrike and what's going on there now, and how we should think differently. Can you walk us through a little bit of that, on what you think is the importance of that case?

And what we can take from it, not only as tech leaders but as business leaders, because the one thing is, if you don't learn from your past, you repeat it.

Justin Daniels: So the post came to me because I actually experienced the repercussions of the outage, and I saw families who were stranded for days, people sleeping in concourses. It was just

awful. And then you watch Delta write a letter that's talking about CrowdStrike. CrowdStrike writes a letter, you know, blaming Delta. And I'm thinking to myself, that's all well and good, but the real issue here is, what about resiliency? [00:04:00] Basically, what happened with that outage could be looked at as a potential cyber attack that took down the airline industry for multiple days, and Delta had even bigger issues.

So if I'm the Delta CEO, one of my first questions should be, how did we have such a significant issue with getting back to business? That goes to what is your disaster recovery plan? But to me, a key question is, the crew software for Delta didn't function correctly. They couldn't get it back up, they couldn't get crews to planes, so without crews, planes go nowhere.

So the question I want to know is, is that software something that was a legacy software that was old, outdated, and they hadn't spent money to modernize it? Because that is a key point, because the reason the healthcare industry has the highest incidence of ransom payments and breach costs is because there are so many legacy software systems still used in healthcare that can't be patched, that [00:05:00] can't be supported. And what you're seeing with this Delta situation is: where are the incentives for companies to spend properly on their data infrastructure?

It's really no different than watching bridges crumble all across the country, and we don't fix them until, of course, they've crumbled. We have the same issue with our IT infrastructure, and my thought is the difference between a modern technology-enabled society and the dark ages, when you're solely relying on technology without a plan B, is pretty darn thin.

Dr. Rebecca Wynn: How do you think we might be able to handle that better? I know one of the arguments out there consistently is, should the airlines, because they're part of our critical infrastructure, should they really be part of a bigger government program, same with the telco situations, and we've had telcos go offline as well.

Do you have any thoughts on that, or do your peers have any thoughts on that? And where does that liability lie? Is it with the government? Because that literally takes the government, I [00:06:00] mean, takes the country offline. Or should that be solely on an airline? In this case, we're talking about Delta, but it could be Southwest, it could be numerous of them.

Justin Daniels: So, I think the past is prologue. And so, let's answer your question by stepping into a time machine, and let's go back to 2008, when Wall Street, without government intervention, was about to crash the global economy. So, the government came in, they bailed them out. But my question for you is: how many of the real people behind it actually got prosecuted?

Nobody. And so, to me, if you really wanted to change things, you'd have real consequences for companies or executives who don't manage cyber hygiene appropriately. One of the interesting, I think, unintended but real consequences of the SEC cyber rules is look who's being held accountable when companies have bad [00:07:00] cyber hygiene.

The CISO. While they may be in charge of security, the person giving them their budget is usually the CEO. But yet, when all the stuff with SolarWinds happened and other things, it wasn't the CEO who got prosecuted, it was the CISO. So now if you're a well-pedigreed security professional, you have to really think twice about whether you want to take a CISO role at a publicly traded company.

So I veered off on those data points to bring us back to your question, which is, I think if you leave the private sector to its own devices, the whole point of an airline or Google is to make profit, to bring a return to their shareholders. So, managing privacy and security are costs, and they're not exciting, and they undermine getting customers, so you kind of know where they're going to stand left to their own devices.

At the same time, when you have government monopolies like public [00:08:00] utilities, we've seen something different. I think an approach that I would take is some kind of hybrid approach, like what we're seeing with the Information Sharing and Analysis Centers, the ISACs, and finding ways to foster more and more collaboration between government and the private sector.

But I also think we need some regulation along the lines of how we have regulations around the kinds of safety devices that you put in cars. When, you know, my parents grew up, they didn't wear a seatbelt, but now you and I are ensconced in airbags, we have sensors, we have all these things to make cars safer.

Are they 100 percent safe? No, but they're a lot safer. How do we get smarter and make cybersecurity the digital seatbelt of the 21st century? But we have to get there faster, because as you see now, with the onset of AI, technology is getting ever faster, and it really overwhelms our ability, and especially that of our legislators, to really comprehend the consequences of all of this [00:09:00] technology.

And I just don't buy the technologists who are telling us, oh, this is going to be all sunshine and rainbows, because I think they minimize the risks. You've seen what's gone on with OpenAI and Sam Altman. I struggle to really accept some of the things he's trying to sell in light of what happened with social media and the actions that they take to minimize the people who are saying, hey, what about privacy and security and the risks?

Dr. Rebecca Wynn: That really is a challenge. I agree with you. I think a lot of the SEC rules are outdated. I just recently did that article on the SEC and SolarWinds and what CISOs can learn from that, up on LinkedIn. But it is a challenge, because you see that the CISOs immediately get called out. Not too many people go ahead and do the articles or the follow-up afterwards.

They said they did their due diligence, SolarWinds did their due diligence, but the guidelines even from the government, I mean, we just had an AI framework, we just signed an executive order that we should have a cybersecurity policy in [00:10:00] place, but we as the United States, for example, are not leading the world in how you should actually be doing these things effectively.

It's really the European Union who's doing that. And even then they're having a hard time going ahead and chasing these companies, because it's really about business liability, you know: if you go ahead and you find us at fault and you go ahead and you take us to court, it's going to be several years.

And even if you fine us whatever, a billion dollars, we know that's going to come down to pennies on the dollar. So how do we manage that? Because it is a business risk that they're taking in determining how they're going to pay fines, if they do ever come down.

Justin Daniels: I think in the United States, we are paying the price for our Congress abdicating their role of putting guardrails around the 21st century digital economy, because you look at data privacy, cybersecurity, and AI, and all of the laws that are making headway are at the state level. We have 52 breach notification laws. [00:11:00] We now have two states with AI laws. I think we have 19 states with privacy laws.

All of this should be done at the federal level, like we do with copyrights and trademarks, because all you're going to do is create a labyrinth of all of these complicated regulations that does stifle innovation, when the regulation really needs to take its course at a federal level. Every time you watch another data breach, the senator pounds his or her fist and says, these data breaches are awful, you know, we should hold people accountable, and they do nothing.

So, when is Congress going to get their act together and actually pass some federal privacy legislation, some federal AI legislation, or cyber legislation? It's ridiculous, but that's what's gone on here for the last, oh, I don't know, 14 years, and there doesn't appear to be any end in sight.

They just talk a good game, they're not willing to actually step up and legislate. 

Dr. Rebecca Wynn: I think part of that, too, is reaching out more to the [00:12:00] private sector, getting our public sector to go ahead and say, how can you help us? Forget about what party line somebody is on; what do they know in the industry about security, privacy, and cybersecurity?

Because, as a field, we want to keep the bad guys at bay, and we don't look at political affiliations when we do that. So it'd be really nice if they would actually go ahead and allow us to be on more committees to help, but committees that actually say we're going to do this in 90 days, 180 days, not three years, four years, six years to "get 'er done." I think that would help a lot as well.

Justin Daniels: I mean, just think about it for a minute. When you watch TV and you see that they have an AI summit at the White House with President Biden, and they invite Sam Altman, they invite Google, Meta, Anthropic, all of the big companies who are looking to engage in a race to see who's going to win the AI race and have dominant market share, and they're going to be the ones who show up, you expect the [00:13:00]

wolf to run the hen house. I mean, it's just, on its face, absurd. They don't have an interest in having regulation that actually regulates. They have an interest in putting on a show that yes, we understand there needs to be regulation, but it's against their business interests. So to think that they're somehow going to magically change and do the altruistic thing when this is probably the biggest revolution since the internet, I think it's just absurd.

Dr. Rebecca Wynn: I know for the bigger companies, their cyber liability insurance, if they do it, it's a huge, humongous deductible, because they have all the departments in place. But for the medium and the smaller companies and startups, the one thing I've seen is the cyber liability insurers are really coming in with deeper assessments.

What does your AI architecture look like? What does your network architecture look like? How are you actually training your employees? We are actually at least going to do a web penetration test of you to see if your website still has all the protections in place every three [00:14:00] months. And we're looking at, if you're going to have AI, what's that additional cost, right? Do you think people like cyber liability insurers are going to have to take more of an ownership role to try to help companies get there, at least the medium and small and startup companies?

Justin Daniels: I think what's changed in the insurance market in the last, I don't know, three to five years is if you don't use MFA, if you don't have certain cyber hygiene in place, you're just uninsurable by 80 percent of the market. So that is a good example of economic incentives because you can't do contracts.

You can't do business deals without showing the other party that you have an insurance certificate. And so if you want that insurance certificate, the insurance industry is influencing you, I think here in a positive way, to have certain things in place. If you're a business operating in 2024 and you aren't using multi-factor authentication to have your employees sign on and whatnot, I think they ought to pass a law that says that's, on its face, gross negligence, from where we sit.

Is MFA a cure all for [00:15:00] everything? You know that it's not, but is it a blocking and tackling thing that every company should be doing, much like putting on your seatbelt when you get in your car? Yes, it should.

Dr. Rebecca Wynn: Who do you think should be held liable on that? I know one of the things is, when you're looking at a lot of these corporations, the people who are the CEOs could also be co-founders. I mean, let's face it, they're behind a lot of sub-companies, and so their personal direct liability is nothing compared to being a CISO, where I always have to watch how much I'm going to get hung out to dry, to be honest with you, and so do my peers.

So how do you think we tackle that a little better for the people who are really making, at times, these strategic calls? How do we hold them liable when they're under so many shields of protection?

Justin Daniels: I think it depends on having a public policy debate and saying, where are we going to allocate the risk?

Like, for example, with the EU AI Act, they are looking to hold responsible the people who are creating the AI or deploying it. The Colorado act is a variation on that theme, where they talk about the [00:16:00] developers and the deployers. Because you and I could use a large language model and tweak it and brand it ourselves.

So those are things that we have to have a debate about because the business community is going to push back and say, you're stifling innovation, you're doing all that. But, you know, look at all the negative and adverse consequences around social media. So the question becomes, does that pendulum need to swing back?

Do you need to have a law that says, look, if we can prove that the company was grossly negligent, should a CEO be held liable? Because gross negligence, as your audience is probably learning with CrowdStrike, is a really high bar. I don't think it can be negligence, because that's too low, but have it be gross negligence or something higher, and then you start to see people change behavior.

It's just something to think about, but the real issue is, the way it works now, we have a situation where companies aren't liable. They hide behind a shield. [00:17:00] They can always file for bankruptcy. So if you have really significant cases where liability is on the individual, I think it changes things.

Had they perp-walked two or three people from 2008, I think you would have seen a whole different reaction to the consequences of the 2008 financial crisis, but we didn't. So if I'm a CEO, I'm thinking, oh, I can take all the risks I want, because at the end of the day, if I bankrupt the company, that's other people's money.

I don't have personal liability. But again, had they perp-walked some of the leaders of JP Morgan, pick your investment bank, I think we'd have a fundamentally different situation.

Dr. Rebecca Wynn: How do you think that's going to change when we do vendor assessments and downstream liability? Because if you go with CrowdStrike, they're a top brand; when you went with SolarWinds at the time, it was a top brand, right? If you go ahead and you have T-Mobile for your telco, a lot of people are going, well, we picked from the top three and we looked at who gave us the best price.

We did our due [00:18:00] diligence along those lines. So why should we, as a company, be made to look like we were not doing our job?

Justin Daniels: I think the problem that companies face is, the bottom line is there's no company that is risk free from being hacked. Then it comes down to the kind of terms that you get in your contract, which is, you can tell with the CrowdStrike situation, reading the two letters, it sounds to me like CrowdStrike doesn't have liability unless they're grossly negligent.

It's a standard contract term for cyber vendors, just like other ones. It's a high bar to prove, and CrowdStrike's argument is going to be, even if we did something, look at all of the other things that Delta had to deal with, and that's on Delta.

It'll be interesting to see where that shakes out, whether they file suit or they settle, because obviously they both have to work together. It's not so easy to get CrowdStrike out. But to your question on vendor management, I think companies need to take it more seriously. If you ask me, the whole point of [00:19:00] the SEC regulation around cybersecurity was aimed at trying to manage third-party risk through regulating publicly traded companies, because so many of them have all of these, you know, thousands of vendors in their ecosystem, and most small and mid-sized companies don't have the inclination, or they don't want to dedicate the resources, to really having any kind of third-party risk management program.

Dr. Rebecca Wynn: Yeah, a lot of times you'll talk to people and they'll say, well, they had a SOC 2, or if it was in healthcare, they had HITRUST. And I'm like, you have to look at the scope. What was tested? A lot of the time, I'm one of those few people who reads those 300-page assessments.

And I'm like, what did the policy say? And were they putting it in place: known, documented, implemented, measured, and managed? Are they doing those five areas? And when we look at a lot of the hearings, they're not looking at those five elements; they look to see there's a data policy and that they follow it, but they don't look at that follow-through.

I think if they would follow those five core areas, it's not only known, but documented, implemented, measured, and managed. And how are you doing that day in, day out? I think [00:20:00] that also could go ahead and curb some of these from being as grandiose as they are.

Justin Daniels: I mean, do I see SOC 2 all the time? Yes, because from a table stakes perspective, companies require them, but I've dealt with several data breaches where it's, oh, we're SOC compliant.

Well, why didn't you use multi-factor authentication for your remote workforce who's accessing your server? That was the root cause of how the threat actor got in. So SOC 2 didn't prevent any of that. And so I think people conflate SOC 2 with compliance, and it's just not that. It's a review of your controls at a point in time that you have the opportunity to prepare for, much like an audit of your financial statements and all that other stuff, and people misunderstand being compliant as having security, and it's just not.

Dr. Rebecca Wynn: Along with controlling the scope of what's entailed in those reports and how they're being tested in some [00:21:00] respects as well, because there's a lot of, you can test it this way or this way or this way. So I tell people you have to really pay attention, and never do business with someone solely because they have some sort of certification; that's always my advice to them. How would you advise

people who are out there who are like, you know what, I do go ahead and I lead up our vendor management, and now I do see, the last couple of years, that every time I try to go ahead and follow a standard checklist on whether I can do business with these people, you know, we end up getting breached. I think one of the things is that you really need to look closely at those assessments, but you also have to look at your contract.

Are they just saying best efforts? Are they saying best practices? How are they wording the contract to hold themselves liable?

Justin Daniels: Yeah, so one area where I see more companies having to enact better cyber hygiene: if you want to do business with one of the bigger companies that's publicly traded or just, you know, bigger, now they're requiring you to go through a pretty significant [00:22:00] third-party management process for cybersecurity.

So now I've had several clients come to me saying, hey, we're not getting this business deal unless we get our cyber house in order. When I had mentioned it to them in the past, it was like, all right, we'll get to it. Now, facing the prospect of losing business, they're saying, yep, I've got to do this. Which is why I'm saying there are ways, economically, we've identified, with bigger companies

and getting insurance, where you can put economic incentives in place and say, hey, if you want this business and this revenue, you have to upgrade your cyber hygiene. But we know at a big-company level, that's not enough. And that's where some of the other stuff that I was talking about might come into play, to give companies bigger incentives, because if they're grossly negligent,

maybe the CEO is personally liable, capped at a certain amount. Now you're starting to get ideas out there that maybe start to change behavior. But we as a society need to have that debate, because [00:23:00] people are going to say, well, wait a second, I didn't build my life savings to run a company and get sued personally.

Well, make the standard pretty high, because I think you've seen in multiple different breaches, you're like, what did these people do? And maybe that's a risk that we should consider putting onto companies. Because my issue is, I don't know how you're going to get for-profit companies to change their tune when the whole point is shareholder return and market share; cybersecurity and privacy are inconvenient, so they're just not going to be at the forefront unless you have an incentive or a requirement to do it, and you are seeing both of them in these economic contexts, and then you have them in laws.

Dr. Rebecca Wynn: I think all these cases and points remind people that businesses are in business to make money and having a breach or something along those lines, there's an economic cost. And they do calculations on enterprise risk management on if we were going to go [00:24:00] ahead and change out this legacy equipment or this other equipment, what's it going to cost?

And then what's the likelihood of something happening? And if that likelihood happens, what is the cost outlay to us? It's a trade-off. And I think part of the thing is people forget about that. There's no company out there making sure you as an individual are a hundred percent secure, with a hundred percent privacy, at all points in time.

You've got to take ownership yourself as well.

Justin Daniels: That's true, which requires a shift in mindset, meaning when you're out and you're at the restaurant and they say, Hey, give us your email and we'll give you a receipt. I politely decline because I'm like, why do I want to give them that so they can market to me?

Or you go to one of those digital kiosks that'll take a selfie of you and your family wearing a funky hat. It's fun. But how do you get the hat? You have to give them your mobile number. Well, what is the digital kiosk company doing with your mobile number? So, you're right. People have to really start to take ownership, but that requires changing your mindset.

You need to treat your digital life [00:25:00] much like your physical life and have keys and locks, and really be thoughtful about who you're going to let into your digital persona and what information you're going to give out. I'm still surprised by how many people put their birthday on Facebook and put all this personal information about themselves.

Now, any hacker who wants to learn more about you, you've just given it to them. There's nothing for them to do other than go research your Facebook, your LinkedIn page, and everything else. So that's why I always use Tiger Woods's birthday. So my birthday is December 30th of 1975. It's Tiger Woods's birthday.

Dr. Rebecca Wynn: Yeah, I would say none of that's true about me either, and you even know on LinkedIn I tell you, don't use that email address, because the email address is a dump box, and even if you try to send to it, it says this is a dump box.

But yeah, people need to watch that, and businesses need to watch that too. The one thing I tell people is to really be careful of the terms of service and privacy and things along those lines. One thing that I suggest to people is to go through the apps on your phone.

When was the last time you used any of [00:26:00] those apps? Delete them. I used them seven years ago. Well, they have your data on that. Go delete them. I think companies need to do that as well. When you look at the vendor management, you know, who do we do business with, why are we doing business with them?

When's the last time we really did a deep dive in their security and privacy and practices, or are we just renewing this contract every three years without looking to see that they've actually upped their game as well? Do you see that from a legal perspective as well, too, with your clients?

Justin Daniels: I think it depends on clients who have had cybersecurity issues or data breaches.

If I'm dealing with someone who hasn't had any of that, they tend to not think about it as much. But if I'm dealing with someone who has had experience with a data breach, they tend to think about that very differently. I'm still very much surprised, particularly on M&A deals, at how little due diligence gets done from a cyber perspective.

It really varies from company to company, and even with the big PE firms, there are some that do [00:27:00] good due diligence, and there are others where I'm just scratching my head: is that all you're going to do? But I represented the seller; it wasn't my job to tell them. But what are you going to do if you hook up a mom-and-pop car wash syndicate to your reporting requirements and there's an undetected cyber intrusion?

Well, now they're just going to walk onto your network and encrypt you, because they're able to, because you didn't do any due diligence or anything like that. I still see that pretty often on the M&A part of my practice.

Dr. Rebecca Wynn: Yeah, I do that myself, where I get called in by legal firms, and I get called in by private equity and VCs: what did we miss? I'm like, there's a whole heck of a lot you missed. And then it's talking about mitigating damages, right? Which side really didn't do more due diligence? You just assumed that you could give your cyber person one hour to look at privacy compliance and everything would be okay, because in our case we just said you had a SOC 2 report or some other report.

Justin Daniels: I think you're going to see a publicly traded company get sued [00:28:00] by their shareholders because they did an M&A deal, didn't do adequate diligence, and there was some cyber intrusion into the purchased company's network that impacts them. Relying on reps and warranties and not showing you did diligence,

I don't think it's going to fly. By the same token, I've had CISOs call me and say, hey, we're about to sell our company, and should we do a pen test on our website? Because that's the asset we're selling. And I'm like, well, would you rather know now, or would you rather get the question in diligence?

Because if I'm the buyer, if I'm representing the buyer on that kind of thing, I want a copy of their most recent pen test. If they don't have one, that tells me a lot about their lack of attention to cyber hygiene. So that could impact what I ask for. It could impact the deadline to get the deal done.

It could require a larger holdback. It could require an adjustment to the purchase price, because I have to spend more to get their cyber hygiene up to snuff. And so, you know, I've been talking to some investment banks to [00:29:00] say, hey, when you're prepping these guys for sale, you have a whole list of cyber diligence and hygiene you need to do before you put them up on the market.

Dr. Rebecca Wynn: One thing I see along those lines, that I advise on: I see a lot of that paperwork is still missing artificial intelligence. What is your architecture for AI?

What are the API calls? How are you tagging data that you're allowing to go outside your company to be used in those repositories? How are you going ahead and watching the data come in? Because you're looking at downstream, and possibly upstream, liabilities.

That's one thing I see missing: the AI due diligence around that for mergers and acquisitions.

Justin Daniels: AI diligence for mergers and acquisitions? I haven't seen it yet, let alone having companies really document well how they're deploying AI and its use cases, let alone vetting third parties who are including AI as part of their service offering.

So I'm seeing inadequacies in [00:30:00] all of those areas, which isn't surprising, because again, the technology is way out in front of our ability to comprehend it. There are really no regulatory guardrails, although to be fair, the advice I've been giving is we need to look at the Colorado Act, we need to look at the EU AI Act, to see how regulators are thinking about that. The NIST AI Risk Management Framework also gives you a really good holistic approach to developing a good AI compliance program, but most companies are so worried about their competition and deploying stuff.

Is it any wonder you see chatbots selling a Ford F-150 for $2, or some of the other funny stuff we've seen go on? It's because people are in a rush. They don't want to be left out, so they don't really think about privacy, security, and the other issues. And those are the things that end up biting them.

Dr. Rebecca Wynn: Unfortunately, our time has totally flown by. I want to thank everybody for joining us for this episode. Please go ahead and look at the description. You'll have Justin's information and [00:31:00] resource links.

Please go ahead and make sure that you like, subscribe, and share the show, as well as the Soulful CXO Insights Newsletter. Justin, thank you so much for sharing your insights and for being on the show today.

Justin Daniels: Thank you.