ITSPmagazine Podcast Network

The Impact of AI | A Conversation with Diana Kelley | On Cyber & AI Podcast with Christina Stokes

Episode Summary

Tune in to this On Cyber & AI Podcast episode, hosted by Christina Stokes, for a conversation with security expert and CISO Diana Kelley about the impact of AI on the cybersecurity industry and beyond.

Episode Notes

Guest: Diana Kelley, CISO, Protect.AI [@ProtectAICorp]

On LinkedIn | https://www.linkedin.com/in/dianakelleysecuritycurve/

______________________

Host: Christina Stokes, Host of On Cyber & AI Podcast

On ITSPmagazine | https://www.itspmagazine.com/itspmagazine-podcast-radio-hosts/christina-stokes

______________________

This Episode’s Sponsors

Are you interested in sponsoring an ITSPmagazine Channel?
👉 https://www.itspmagazine.com/sponsor-the-itspmagazine-podcast-network

______________________

Episode Description

Diana and Christina discuss how ML and AI have evolved and the impact of the recent explosion of AI. Diana covers the technology's growth and adoption, what we can anticipate in the future, and shares advice for anyone interested in pursuing a career in AI or cybersecurity.

______________________

Resources

______________________

To see and hear more of On Cyber & AI Podcast with Christina Stokes content on ITSPmagazine, visit: https://www.itspmagazine.com/itspmagazine-podcast-radio-hosts/christina-stokes

Watch the webcast version on-demand on YouTube: https://www.youtube.com/playlist?list=PLnYu0psdcllTLfC_TKdcM1qGOpplw0lNT

Are you interested in sponsoring an ITSPmagazine Channel?
👉 https://www.itspmagazine.com/sponsor-the-itspmagazine-podcast-network

Episode Transcription

The Impact of AI | A Conversation with Diana Kelley | On Cyber & AI Podcast with Christina Stokes

00:00:02.232 --> 00:00:03.734

Thank you for joining ITSP

 

00:00:03.754 --> 00:00:06.276

Magazine's On Cyber and AI podcast.

 

00:00:06.476 --> 00:00:07.837

I'm your host, Christina Stokes,

 

00:00:08.038 --> 00:00:10.160

and today's guest is Diana Kelley.

 

00:00:10.680 --> 00:00:13.303

I'd love for you to tell us about yourself,

 

00:00:13.323 --> 00:00:13.623

Diana.

 

00:00:14.722 --> 00:00:16.544

Sure, I am. Well, first of all,

 

00:00:16.564 --> 00:00:17.524

thanks for having me here

 

00:00:17.684 --> 00:00:18.925

and congratulations on your

 

00:00:18.966 --> 00:00:20.567

new podcast. Thank you. And

 

00:00:20.707 --> 00:00:23.149

then I have been in IT for

 

00:00:23.229 --> 00:00:24.650

well over three decades

 

00:00:24.690 --> 00:00:25.431

which is kind of like

 

00:00:25.471 --> 00:00:26.812

mind-blowing when it's like

 

00:00:26.852 --> 00:00:27.993

where did the years go?

 

00:00:28.433 --> 00:00:30.235

Somewhere. Um, but so I've

 

00:00:30.255 --> 00:00:31.135

seen, I've seen a lot of

 

00:00:31.196 --> 00:00:32.497

different technology over

 

00:00:32.557 --> 00:00:34.158

the course of my career,

 

00:00:34.778 --> 00:00:36.660

and I, as many people did,

 

00:00:36.720 --> 00:00:37.561

started to get really

 

00:00:37.641 --> 00:00:39.843

interested in AI and how AI

 

00:00:39.883 --> 00:00:41.384

was going to change the landscape for us.

 

00:00:41.704 --> 00:00:42.704

Starting back when I was at

 

00:00:42.865 --> 00:00:45.046

IBM and we were training

 

00:00:45.086 --> 00:00:47.647

Watson for cybersecurity as

 

00:00:47.727 --> 00:00:49.147

part of what the team was

 

00:00:49.207 --> 00:00:50.808

doing to help others be

 

00:00:50.868 --> 00:00:52.409

able to take advantage of AI.

 

00:00:52.449 --> 00:00:53.910

And then when I was at Microsoft,

 

00:00:53.930 --> 00:00:54.970

I continued to be really

 

00:00:55.090 --> 00:00:56.371

interested in this space

 

00:00:56.431 --> 00:00:57.651

and trying to understand the work,

 

00:00:57.731 --> 00:00:58.712

especially around ethics.

 

00:00:59.528 --> 00:01:01.710

which has brought me to where I am now,

 

00:01:01.750 --> 00:01:02.971

where I am the CISO.

 

00:01:03.051 --> 00:01:04.793

So I'm doing my security,

 

00:01:05.153 --> 00:01:07.515

but at a company called Protect AI,

 

00:01:07.635 --> 00:01:09.077

where we are focusing on

 

00:01:09.137 --> 00:01:11.559

creating tools to secure

 

00:01:11.579 --> 00:01:13.401

and creating a platform to

 

00:01:13.441 --> 00:01:15.342

secure the machine learning

 

00:01:15.423 --> 00:01:18.505

operations lifecycle.

 

00:01:18.766 --> 00:01:20.107

What motivates you every day

 

00:01:20.187 --> 00:01:22.029

and why cybersecurity?

 

00:01:22.069 --> 00:01:23.390

Why have you stayed in this field?

 

00:01:25.335 --> 00:01:26.416

What motivates me every day

 

00:01:26.436 --> 00:01:29.358

is just really making sure,

 

00:01:29.558 --> 00:01:31.799

kind of like not letting

 

00:01:31.819 --> 00:01:32.579

the bad guys win.

 

00:01:32.800 --> 00:01:34.901

If I had to just distill it

 

00:01:34.921 --> 00:01:35.801

down to one thing.

 

00:01:36.001 --> 00:01:38.183

And it started because I got

 

00:01:38.223 --> 00:01:39.303

really fascinated in

 

00:01:39.343 --> 00:01:41.304

technology way back in the

 

00:01:41.785 --> 00:01:43.005

1970s and what could be

 

00:01:43.065 --> 00:01:44.346

done with collaborative

 

00:01:44.406 --> 00:01:45.527

computing and connecting

 

00:01:45.587 --> 00:01:46.948

people and allowing us all

 

00:01:46.988 --> 00:01:47.508

to work together.

 

00:01:47.868 --> 00:01:49.069

So I was just so excited by

 

00:01:49.129 --> 00:01:49.949

what could be done.

 

00:01:50.150 --> 00:01:51.350

By the time I was practicing

 

00:01:51.430 --> 00:01:52.151

professionally,

 

00:01:53.091 --> 00:01:54.712

networks had been joined together.

 

00:01:54.732 --> 00:01:56.192

Email was starting to become

 

00:01:56.232 --> 00:01:56.933

something that was

 

00:01:57.433 --> 00:01:58.774

commonplace in the workforce.

 

00:01:58.834 --> 00:02:00.174

So I know for a lot of people, you're like,

 

00:02:00.214 --> 00:02:01.595

there was a time before email.

 

00:02:02.075 --> 00:02:03.055

And it's like, yes,

 

00:02:03.095 --> 00:02:05.276

we would also ride our horses to work.

 

00:02:07.437 --> 00:02:08.398

But in any case...

 

00:02:09.458 --> 00:02:10.699

I had built and was very

 

00:02:10.799 --> 00:02:12.180

proud of a network for our

 

00:02:12.220 --> 00:02:13.161

startup that I was working

 

00:02:13.201 --> 00:02:14.521

for in Cambridge, Massachusetts.

 

00:02:14.702 --> 00:02:16.623

And this startup wanted to

 

00:02:16.643 --> 00:02:19.445

start distributing our patches using FTP,

 

00:02:19.505 --> 00:02:20.445

which was really kind of

 

00:02:20.465 --> 00:02:21.626

groundbreaking at the time.

 

00:02:21.666 --> 00:02:22.266

It was like, oh,

 

00:02:22.326 --> 00:02:23.607

instead of having to snail

 

00:02:23.627 --> 00:02:25.268

mail a patch on a disk,

 

00:02:25.629 --> 00:02:26.509

I know people are like,

 

00:02:26.669 --> 00:02:28.550

I swear there's a horse involved in here.

 

00:02:28.590 --> 00:02:31.472

So instead of snail mailing

 

00:02:31.512 --> 00:02:32.273

a patch on a disk,

 

00:02:32.313 --> 00:02:33.394

we could just allow our

 

00:02:33.454 --> 00:02:34.874

customers to come in and download it.

 

00:02:34.894 --> 00:02:36.375

So I was really proud that

 

00:02:36.395 --> 00:02:37.116

we had built that out.

 

00:02:37.634 --> 00:02:39.456

But someone attacked that

 

00:02:39.516 --> 00:02:40.917

server that we were using

 

00:02:41.037 --> 00:02:42.898

to do that and got onto my network.

 

00:02:43.779 --> 00:02:46.401

And it was at a very inopportune time.

 

00:02:46.421 --> 00:02:47.862

The first attack that we had

 

00:02:47.922 --> 00:02:49.444

was actually on a Christmas Eve.

 

00:02:50.804 --> 00:02:52.766

They love weekends and holidays.

 

00:02:53.146 --> 00:02:53.546

They do.

 

00:02:53.726 --> 00:02:54.687

They know that people are

 

00:02:54.727 --> 00:02:55.748

going home and that was it.

 

00:02:55.828 --> 00:02:57.049

We had started to go home

 

00:02:57.089 --> 00:02:57.969

and it was just me and a

 

00:02:58.009 --> 00:02:59.230

contractor and we had to

 

00:02:59.290 --> 00:03:00.331

stay that night until we

 

00:03:00.351 --> 00:03:02.192

had figured out where the

 

00:03:02.232 --> 00:03:02.953

attack was coming from,

 

00:03:02.973 --> 00:03:04.754

that we had not only

 

00:03:04.794 --> 00:03:06.175

eradicated it out of our systems,

 

00:03:06.195 --> 00:03:07.276

but we had made sure that

 

00:03:07.416 --> 00:03:08.217

we had prevented the

 

00:03:08.257 --> 00:03:09.218

problem so it couldn't be

 

00:03:10.078 --> 00:03:10.739

they couldn't start the

 

00:03:10.779 --> 00:03:12.841

attack again. But I got kind

 

00:03:12.861 --> 00:03:14.043

of upset, not just because

 

00:03:14.083 --> 00:03:15.204

it was Christmas Eve, but

 

00:03:15.504 --> 00:03:16.426

what I really got upset

 

00:03:16.466 --> 00:03:17.427

about was that we had

 

00:03:17.467 --> 00:03:18.728

worked so hard to create

 

00:03:18.768 --> 00:03:20.750

this network and do good

 

00:03:20.791 --> 00:03:21.672

things with it for our

 

00:03:21.752 --> 00:03:22.913

company, for our customers.

 

00:03:23.273 --> 00:03:24.074

And then here were these

 

00:03:24.154 --> 00:03:25.296

attackers that were coming

 

00:03:25.416 --> 00:03:26.958

in and disabling us from

 

00:03:26.998 --> 00:03:28.059

taking, you know, the benefit

 

00:03:28.099 --> 00:03:29.200

of using our network. And I was like,

 

00:03:30.081 --> 00:03:30.741

Never again.

 

00:03:30.761 --> 00:03:31.762

That's not okay.

 

00:03:33.002 --> 00:03:34.022

This isn't your network.

 

00:03:34.102 --> 00:03:34.522

It's ours.

 

00:03:34.562 --> 00:03:35.723

You shouldn't be trying to

 

00:03:36.103 --> 00:03:36.783

take and break it.

 

00:03:37.503 --> 00:03:38.804

So that really is what gets

 

00:03:38.844 --> 00:03:40.264

me up every day and keeps

 

00:03:40.284 --> 00:03:41.524

me motivated is that I

 

00:03:41.764 --> 00:03:43.005

really do want to make sure

 

00:03:43.065 --> 00:03:44.725

that people can get all of

 

00:03:44.785 --> 00:03:47.066

the wonderful benefits of technology

 

00:03:48.791 --> 00:03:51.953

in as safe and reliable a way as possible.

 

00:03:52.033 --> 00:03:52.413

Obviously,

 

00:03:52.433 --> 00:03:53.294

there's no such thing as

 

00:03:53.394 --> 00:03:54.215

absolute security.

 

00:03:54.695 --> 00:03:55.776

And that is partly why we

 

00:03:55.796 --> 00:03:56.936

have to do our jobs every day,

 

00:03:56.976 --> 00:03:57.717

but to make sure that

 

00:03:57.737 --> 00:03:59.998

people have as low risk as

 

00:04:00.098 --> 00:04:01.239

possible as they're getting

 

00:04:01.259 --> 00:04:02.520

the benefits of technology.

 

00:04:03.100 --> 00:04:03.660

Absolutely.

 

00:04:03.680 --> 00:04:05.722

Technology can be used as a

 

00:04:05.762 --> 00:04:06.802

tool or a weapon.

 

00:04:06.902 --> 00:04:09.404

So I love to hear that there

 

00:04:09.464 --> 00:04:11.305

are people working to keep

 

00:04:12.222 --> 00:04:13.783

it as a tool and prevent

 

00:04:13.823 --> 00:04:15.445

others from using it as a weapon.

 

00:04:15.465 --> 00:04:18.387

At Protect AI, you are the CISO.

 

00:04:18.487 --> 00:04:19.448

Is there anything you can

 

00:04:19.528 --> 00:04:21.530

share about what you're working on there?

 

00:04:22.799 --> 00:04:23.399

Yeah, I mean,

 

00:04:23.640 --> 00:04:25.121

the company itself, we're

 

00:04:25.141 --> 00:04:26.662

building out a platform for

 

00:04:26.722 --> 00:04:28.063

building security into the

 

00:04:28.103 --> 00:04:29.063

machine learning and

 

00:04:29.104 --> 00:04:30.484

security operations lifecycle.

 

00:04:30.525 --> 00:04:31.825

And a lot of times people will say,

 

00:04:31.865 --> 00:04:33.407

but it's machine learning, not AI.

 

00:04:33.907 --> 00:04:35.128

And a lot of generative AI

 

00:04:35.168 --> 00:04:36.769

is actually driven by machine learning.

 

00:04:36.809 --> 00:04:38.370

So as we secure the machine

 

00:04:38.390 --> 00:04:39.211

learning lifecycle,

 

00:04:39.491 --> 00:04:41.312

we create security within

 

00:04:41.672 --> 00:04:42.593

all of the different ways

 

00:04:42.633 --> 00:04:44.154

that those models are used,

 

00:04:44.254 --> 00:04:46.156

including generative AI.

 

00:04:46.236 --> 00:04:48.577

We also have tools that will

 

00:04:48.617 --> 00:04:50.259

help to protect

 

00:04:50.959 --> 00:04:52.520

interactions with generative

 

00:04:52.620 --> 00:04:54.901

AI once it's in production and in use,

 

00:04:55.721 --> 00:04:57.702

both from the person using it,

 

00:04:57.782 --> 00:04:58.703

making sure they're not

 

00:04:58.843 --> 00:05:00.564

asking too much of that

 

00:05:00.624 --> 00:05:02.184

system or giving it

 

00:05:02.224 --> 00:05:03.425

sensitive data that they shouldn't,

 

00:05:03.445 --> 00:05:03.945

for example,

 

00:05:04.325 --> 00:05:05.466

and also making sure that

 

00:05:05.506 --> 00:05:07.327

they're not attacking the

 

00:05:07.367 --> 00:05:08.848

system by getting the

 

00:05:08.888 --> 00:05:10.328

system to do or say

 

00:05:10.388 --> 00:05:11.529

something that it shouldn't.

 

00:05:12.529 --> 00:05:13.510

So there's, you know,

 

00:05:13.750 --> 00:05:14.891

but this all begins at the

 

00:05:14.931 --> 00:05:16.052

beginning of the lifecycle

 

00:05:16.092 --> 00:05:17.493

with our machine learning,

 

00:05:17.533 --> 00:05:18.693

with what models are we

 

00:05:18.733 --> 00:05:19.594

going to download?

 

00:05:19.834 --> 00:05:20.675

What data are we going to

 

00:05:20.715 --> 00:05:21.835

use to train those models?

 

00:05:22.136 --> 00:05:23.156

How do we train those models?

 

00:05:23.196 --> 00:05:24.017

Who trains those models?

 

00:05:24.037 --> 00:05:24.857

How do we label that

 

00:05:24.897 --> 00:05:26.258

training data, if we are doing

 

00:05:26.338 --> 00:05:27.619

supervised machine learning?

 

00:05:27.919 --> 00:05:29.100

So all of those pieces,

 

00:05:29.140 --> 00:05:30.381

and then we create a

 

00:05:30.441 --> 00:05:31.662

platform that enables you

 

00:05:31.702 --> 00:05:34.284

from the beginning within the lifecycle,

 

00:05:34.404 --> 00:05:34.644

you know,

 

00:05:34.684 --> 00:05:36.365

to create a machine learning

 

00:05:37.125 --> 00:05:37.966

bill of materials.

 

00:05:38.426 --> 00:05:39.547

And do software composition

 

00:05:39.607 --> 00:05:40.728

analysis for your machine

 

00:05:40.768 --> 00:05:41.769

learning models all the way

 

00:05:41.869 --> 00:05:42.950

out to when you're in

 

00:05:42.990 --> 00:05:44.711

production and using generative AI.

 

00:05:45.432 --> 00:05:47.214

What is one of the biggest

 

00:05:47.914 --> 00:05:48.975

misconceptions that you

 

00:05:49.015 --> 00:05:50.917

have seen or heard in

 

00:05:50.977 --> 00:05:53.679

regards to AI and what

 

00:05:53.719 --> 00:05:54.820

you're specifically working

 

00:05:54.900 --> 00:05:55.961

on in the space that you're in?

 

00:05:57.253 --> 00:05:57.393

Well,

 

00:05:57.413 --> 00:06:00.096

the biggest misconception I hear is

 

00:06:00.356 --> 00:06:03.139

that machine learning is, or AI,

 

00:06:03.319 --> 00:06:05.782

is so incredibly powerful

 

00:06:05.962 --> 00:06:07.224

and intelligent that it's

 

00:06:07.304 --> 00:06:08.525

going to be able to be

 

00:06:08.585 --> 00:06:09.966

powerful and intelligent in

 

00:06:10.127 --> 00:06:11.268

all different dimensions

 

00:06:12.046 --> 00:06:15.589

of knowledge and that's not

 

00:06:15.669 --> 00:06:16.670

exactly the case.

 

00:06:17.650 --> 00:06:21.173

If we're talking about mathematics,

 

00:06:22.935 --> 00:06:23.835

computers are going to be

 

00:06:23.875 --> 00:06:24.916

able to do things that are

 

00:06:24.936 --> 00:06:27.138

better than people in a lot of cases.

 

00:06:27.238 --> 00:06:27.858

For example,

 

00:06:28.179 --> 00:06:29.540

if I gave you two 20-digit

 

00:06:29.580 --> 00:06:30.921

numbers right now to multiply,

 

00:06:31.621 --> 00:06:32.962

or you gave me, right?

 

00:06:33.022 --> 00:06:34.183

Both of us would take quite

 

00:06:34.203 --> 00:06:35.244

a while with pen and paper

 

00:06:35.264 --> 00:06:36.265

trying to do that manually.

 

00:06:36.305 --> 00:06:37.306

A calculator is going to be

 

00:06:37.346 --> 00:06:38.947

able to do that very, very quickly.

 

00:06:39.508 --> 00:06:41.329

And if it's been tested and

 

00:06:41.349 --> 00:06:42.190

deployed properly,

 

00:06:42.270 --> 00:06:43.471

it's going to be able to do

 

00:06:43.511 --> 00:06:45.192

it with a very high level of accuracy.

 

00:06:45.552 --> 00:06:46.633

Human beings also aren't

 

00:06:46.713 --> 00:06:48.054

that great sometimes with our math,

 

00:06:48.114 --> 00:06:48.314

right?

 

00:06:48.355 --> 00:06:48.855

Check the math.

 

00:06:50.956 --> 00:06:52.017

But there are other things

 

00:06:52.037 --> 00:06:52.997

that humans do that we

 

00:06:53.037 --> 00:06:54.618

don't necessarily count as

 

00:06:54.698 --> 00:06:55.779

part of intelligence,

 

00:06:56.199 --> 00:06:58.000

but actually do take quite

 

00:06:58.080 --> 00:06:59.500

a lot of intelligence.

 

00:07:00.061 --> 00:07:02.442

So I use this example a lot,

 

00:07:03.002 --> 00:07:03.822

which is to pick up

 

00:07:03.882 --> 00:07:05.183

something and pick up a

 

00:07:05.223 --> 00:07:06.744

vessel and drink from it.

 

00:07:07.804 --> 00:07:09.625

This actually, to be able to pick this up,

 

00:07:09.685 --> 00:07:10.945

I mean, this could have been in a glass.

 

00:07:11.025 --> 00:07:12.065

I use glasses a lot.

 

00:07:12.105 --> 00:07:13.446

Sometimes I pour this in.

 

00:07:13.826 --> 00:07:15.546

The amount of liquid that's in here,

 

00:07:15.586 --> 00:07:17.167

I actually can't see this right now.

 

00:07:17.507 --> 00:07:18.527

This could be full or it

 

00:07:18.567 --> 00:07:19.468

could be almost empty.

 

00:07:19.808 --> 00:07:20.548

If it was full,

 

00:07:21.028 --> 00:07:22.609

I need to create a

 

00:07:22.669 --> 00:07:24.109

different amount of grip strength.

 

00:07:24.469 --> 00:07:25.409

I need to put a different

 

00:07:25.469 --> 00:07:26.810

amount of force in my arm.

 

00:07:27.050 --> 00:07:28.450

There's a whole lot, bottom line,

 

00:07:28.490 --> 00:07:29.931

there's a whole lot of math

 

00:07:30.051 --> 00:07:32.372

that goes on with something very simple,

 

00:07:32.412 --> 00:07:33.612

but because human beings do

 

00:07:33.652 --> 00:07:35.753

this every day without thinking, we go,

 

00:07:36.133 --> 00:07:36.833

eh, that's easy.

 

00:07:37.233 --> 00:07:38.977

Well, it's not necessarily easy.

 

00:07:39.037 --> 00:07:40.319

Now think about training an

 

00:07:40.740 --> 00:07:42.464

AI-driven robot to be able

 

00:07:42.504 --> 00:07:44.488

to pick up any kind of

 

00:07:45.069 --> 00:07:46.873

vessel and know the exact

 

00:07:46.913 --> 00:07:47.734

amount of strength to

 

00:07:48.692 --> 00:07:51.374

bring it up to drink, right?

 

00:07:51.875 --> 00:07:52.696

So that's it,

 

00:07:52.796 --> 00:07:54.317

is that we don't

 

00:07:54.417 --> 00:07:56.699

necessarily get down to the

 

00:07:56.759 --> 00:07:58.200

levels of what intelligence is.

 

00:07:58.240 --> 00:07:58.721

And sometimes we

 

00:07:58.961 --> 00:08:00.442

overestimate how hard something is,

 

00:08:00.462 --> 00:08:02.064

because it's hard for us, i.e.

 

00:08:02.084 --> 00:08:02.424

math.

 

00:08:02.864 --> 00:08:03.865

And we underestimate

 

00:08:03.905 --> 00:08:04.646

something that's actually

 

00:08:04.666 --> 00:08:05.367

pretty difficult,

 

00:08:05.387 --> 00:08:06.348

because it's easy for us,

 

00:08:06.408 --> 00:08:07.929

like picking up a glass and

 

00:08:07.969 --> 00:08:09.210

being able to bring it to

 

00:08:09.250 --> 00:08:10.611

our mouths to be able to drink from it.

 

00:08:11.692 --> 00:08:13.954

So that misconception has led to some,

 

00:08:14.295 --> 00:08:15.616

I've heard things like, well, you know,

 

00:08:16.799 --> 00:08:19.200

OpenAI, GenAI, ChatGPT,

 

00:08:19.901 --> 00:08:21.421

it passed the bar and it's

 

00:08:21.441 --> 00:08:22.482

going to be smarter than

 

00:08:22.522 --> 00:08:23.942

lawyers within six months.

 

00:08:25.603 --> 00:08:27.684

Not necessarily because the

 

00:08:27.864 --> 00:08:29.125

kinds of intelligence that

 

00:08:29.185 --> 00:08:30.986

you need to have to be a lawyer

 

00:08:32.186 --> 00:08:33.887

isn't exactly the way that

 

00:08:33.927 --> 00:08:35.828

ChatGPT is working yet.

 

00:08:35.988 --> 00:08:36.848

One of the big problems with

 

00:08:36.868 --> 00:08:37.949

ChatGPT in the legal

 

00:08:37.969 --> 00:08:40.730

profession so far has been the, quote,

 

00:08:40.810 --> 00:08:42.511

hallucinations, creating

 

00:08:42.551 --> 00:08:44.291

statistically probable

 

00:08:44.852 --> 00:08:47.133

cases that could have

 

00:08:47.793 --> 00:08:48.553

built a precedent.

 

00:08:49.534 --> 00:08:51.054

But it doesn't necessarily

 

00:08:51.094 --> 00:08:52.075

go and check to see if

 

00:08:52.135 --> 00:08:54.576

those cases are actual cases.

 

00:08:54.636 --> 00:08:56.236

Maybe it just invented some

 

00:08:56.276 --> 00:08:56.997

of those cases.

 

00:08:58.337 --> 00:08:59.578

So that misconception,

 

00:08:59.598 --> 00:09:00.539

the misconception that the

 

00:09:00.659 --> 00:09:02.381

AI is always smarter than

 

00:09:02.481 --> 00:09:04.162

us and that it's always

 

00:09:04.222 --> 00:09:05.444

going to be better than humans.

 

00:09:05.544 --> 00:09:06.705

I think that's really

 

00:09:07.125 --> 00:09:08.627

a big one that we

 

00:09:08.707 --> 00:09:10.428

just, as all of us, need

 

00:09:10.448 --> 00:09:12.330

to be thinking about

 

00:09:12.610 --> 00:09:13.771

where it's really smarter,

 

00:09:13.831 --> 00:09:15.112

what the best use cases are

 

00:09:15.253 --> 00:09:16.494

and where we really

 

00:09:16.534 --> 00:09:17.695

need to have humans still

 

00:09:17.755 --> 00:09:18.976

involved with the thinking and nuance.

 

00:09:20.177 --> 00:09:21.398

The other big misconception

 

00:09:21.958 --> 00:09:23.418

that is concerning and one

 

00:09:23.458 --> 00:09:25.179

that at Protect AI we're

 

00:09:25.219 --> 00:09:27.600

very much hoping to help with quite a bit,

 

00:09:28.220 --> 00:09:30.541

which is that some people

 

00:09:30.601 --> 00:09:33.122

kind of think that AI just

 

00:09:33.202 --> 00:09:35.263

got invented in November

 

00:09:35.263 --> 00:09:36.603

2022 and that ChatGPT was

 

00:09:36.643 --> 00:09:38.364

the first real deployment of AI.

 

00:09:38.724 --> 00:09:39.705

That's not the case.

 

00:09:39.785 --> 00:09:41.065

I had mentioned being at

 

00:09:41.145 --> 00:09:42.506

IBM when Watson for Cyber

 

00:09:42.546 --> 00:09:43.386

was being trained.

 

00:09:43.786 --> 00:09:45.347

And that was eight years ago.

 

00:09:46.307 --> 00:09:47.448

Machine learning models have

 

00:09:47.488 --> 00:09:49.088

been in use in a number of

 

00:09:49.148 --> 00:09:50.889

different sectors and organizations.

 

00:09:51.229 --> 00:09:52.630

Big Pharma, for example,

 

00:09:52.690 --> 00:09:54.311

for trials and looking at

 

00:09:54.371 --> 00:09:57.192

data around new medications, for example,

 

00:09:57.292 --> 00:09:58.413

and financial services.

 

00:09:58.793 --> 00:09:59.954

Some of us may already be

 

00:10:00.134 --> 00:10:01.995

using ML within our

 

00:10:02.055 --> 00:10:02.875

financial services

 

00:10:02.915 --> 00:10:04.556

portfolio if anybody signed

 

00:10:04.576 --> 00:10:07.057

up for a quantitative or robo advisor,

 

00:10:07.077 --> 00:10:07.377

right?

 

00:10:07.397 --> 00:10:08.438

That's machine learning driven.

 

00:10:09.218 --> 00:10:10.619

So the misconception that

 

00:10:10.679 --> 00:10:12.540

this just happened and now

 

00:10:12.580 --> 00:10:13.640

we have to think about how

 

00:10:13.680 --> 00:10:15.081

we secure it going forward.

 

00:10:16.153 --> 00:10:16.713

It's here.

 

00:10:16.894 --> 00:10:18.355

It's been in use in many,

 

00:10:18.395 --> 00:10:19.535

many organizations and

 

00:10:19.595 --> 00:10:21.957

sectors for a significant amount of time.

 

00:10:22.057 --> 00:10:23.058

And the most important thing

 

00:10:23.118 --> 00:10:24.759

is to understand that we

 

00:10:24.799 --> 00:10:26.881

need to build security into

 

00:10:26.941 --> 00:10:28.762

that lifecycle the same way

 

00:10:28.802 --> 00:10:29.863

that we've built it into

 

00:10:29.983 --> 00:10:31.604

our other security lifecycle.

 

00:10:31.664 --> 00:10:33.445

So if you're thinking about DevSecOps,

 

00:10:33.485 --> 00:10:34.466

build security into your

 

00:10:34.486 --> 00:10:35.386

development lifecycle.

 

00:10:36.007 --> 00:10:36.827

MLSecOps,

 

00:10:37.128 --> 00:10:38.509

build security into your machine

 

00:10:38.549 --> 00:10:39.429

learning lifecycle.

 

00:10:39.750 --> 00:10:39.970

Great.

 

00:10:40.630 --> 00:10:42.631

I think that with the

 

00:10:42.651 --> 00:10:44.111

surprise everybody had with

 

00:10:44.251 --> 00:10:46.171

AI when it exploded last year,

 

00:10:46.632 --> 00:10:47.832

my thoughts did go to, well,

 

00:10:47.872 --> 00:10:49.772

we have had ML and a lot of

 

00:10:49.812 --> 00:10:51.253

people don't realize just

 

00:10:51.613 --> 00:10:53.273

what

 

00:10:53.313 --> 00:10:55.174

that has been built into.

 

00:10:55.194 --> 00:10:58.375

And with AI, like, yes, it is smart,

 

00:10:58.595 --> 00:10:59.655

but it doesn't have the

 

00:10:59.695 --> 00:11:00.975

contextual knowledge that

 

00:11:01.055 --> 00:11:03.356

humans have from our own experiences.

 

00:11:04.102 --> 00:11:05.884

And that I think is an area

 

00:11:05.924 --> 00:11:08.446

that AI would need to train or learn in.

 

00:11:08.666 --> 00:11:12.329

And that's a very nuanced

 

00:11:12.429 --> 00:11:13.690

thing for it to be able to

 

00:11:13.730 --> 00:11:15.051

do as a machine.

 

00:11:16.672 --> 00:11:18.333

What have you learned,

 

00:11:18.754 --> 00:11:20.095

something new that you have

 

00:11:20.155 --> 00:11:22.136

learned in this field now

 

00:11:22.176 --> 00:11:25.039

that AI has been such a big

 

00:11:25.119 --> 00:11:26.560

piece of this now?

 

00:11:27.637 --> 00:11:27.857

Well,

 

00:11:27.957 --> 00:11:29.799

I've certainly been on a journey of

 

00:11:30.120 --> 00:11:31.901

learning a lot more about

 

00:11:31.982 --> 00:11:32.902

how machine learning

 

00:11:32.983 --> 00:11:35.205

engineers and data scientists work.

 

00:11:35.914 --> 00:11:38.497

because I came to it from security.

 

00:11:38.557 --> 00:11:40.559

So I brought all my security knowledge in,

 

00:11:40.859 --> 00:11:42.781

but I wasn't deep in machine learning.

 

00:11:42.841 --> 00:11:44.723

I'm not an ML engineer myself.

 

00:11:45.244 --> 00:11:46.685

So it's really learning

 

00:11:46.745 --> 00:11:49.208

about data and how

 

00:11:49.308 --> 00:11:50.869

different data is in the

 

00:11:50.910 --> 00:11:52.651

machine learning and AI world,

 

00:11:52.972 --> 00:11:55.114

how different it is thinking about data.

 

00:11:55.614 --> 00:11:56.195

For example,

 

00:11:56.255 --> 00:11:57.536

if you're training a machine

 

00:11:57.576 --> 00:11:58.117

learning model,

 

00:11:58.470 --> 00:12:00.131

you may well want to train

 

00:12:00.191 --> 00:12:02.132

it on real live data.

 

00:12:02.772 --> 00:12:04.033

It may be anonymized.

 

00:12:04.053 --> 00:12:04.273

I mean,

 

00:12:04.734 --> 00:12:06.214

you may give instructions to the

 

00:12:06.254 --> 00:12:07.355

model to not give out

 

00:12:07.415 --> 00:12:08.716

information about the training data,

 

00:12:08.736 --> 00:12:10.056

but you want to train it on

 

00:12:10.097 --> 00:12:12.298

live data very often so

 

00:12:12.338 --> 00:12:13.478

that you know that it's

 

00:12:13.498 --> 00:12:15.840

been fit properly for its purpose.

 

00:12:16.200 --> 00:12:17.261

Whereas when you're thinking

 

00:12:17.321 --> 00:12:18.501

about software development,

 

00:12:19.102 --> 00:12:20.542

and I've been doing this

 

00:12:20.602 --> 00:12:23.744

for a long time with secure life cycles,

 

00:12:26.378 --> 00:12:28.960

With this development, as you look at that,

 

00:12:29.040 --> 00:12:31.081

we were always about don't

 

00:12:31.181 --> 00:12:32.982

let the real data out.

 

00:12:33.022 --> 00:12:34.123

Don't put production data

 

00:12:34.203 --> 00:12:35.284

into your training data.

 

00:12:35.324 --> 00:12:36.224

You wanted to always

 

00:12:36.264 --> 00:12:37.585

have the separation of the

 

00:12:37.645 --> 00:12:39.206

data. And in machine learning,

 

00:12:39.266 --> 00:12:42.348

it's a very different world with the data.

 

00:12:43.229 --> 00:12:44.129

Has there been any

 

00:12:44.269 --> 00:12:47.391

surprising lessons or key

 

00:12:47.492 --> 00:12:48.692

takeaways that you have

 

00:12:49.233 --> 00:12:51.294

gleaned over this process?

 

00:12:52.737 --> 00:12:53.197

Yeah, I mean,

 

00:12:53.237 --> 00:12:54.578

I think as we're looking at

 

00:12:54.758 --> 00:12:55.939

all of the great research

 

00:12:55.979 --> 00:12:58.160

that's being done by the community,

 

00:12:58.921 --> 00:13:00.862

there have been a series of lessons.

 

00:13:00.922 --> 00:13:02.463

A big lesson was in the

 

00:13:02.503 --> 00:13:04.064

supply chain of machine

 

00:13:04.084 --> 00:13:06.185

learning itself that we need to,

 

00:13:06.265 --> 00:13:06.925

of course,

 

00:13:07.426 --> 00:13:09.287

worry about the data and the

 

00:13:09.347 --> 00:13:10.808

models in the machine learning.

 

00:13:10.828 --> 00:13:11.708

But the machine learning is

 

00:13:11.868 --> 00:13:14.610

running on top of a platform very often,

 

00:13:15.429 --> 00:13:17.391

or a management tool for workflow.

 

00:13:17.671 --> 00:13:18.591

So we have to also make sure

 

00:13:18.631 --> 00:13:20.473

that that supply chain is secure.

 

00:13:20.513 --> 00:13:22.474

We can't just say, oh,

 

00:13:22.834 --> 00:13:24.535

machine learning's just in

 

00:13:24.595 --> 00:13:26.156

its own little bubble and it's separate.

 

00:13:26.537 --> 00:13:28.438

It's running in an environment.

 

00:13:28.458 --> 00:13:29.579

So we also have to make sure

 

00:13:29.599 --> 00:13:30.599

that we've secured that.

 

00:13:31.020 --> 00:13:31.920

So that's been a big

 

00:13:31.940 --> 00:13:33.021

eye-opener for me about

 

00:13:35.203 --> 00:13:37.385

how we have to apply the security,

 

00:13:37.645 --> 00:13:38.646

not just to the machine

 

00:13:38.666 --> 00:13:40.107

learning operations lifecycle,

 

00:13:40.127 --> 00:13:42.289

but also to the environment

 

00:13:42.329 --> 00:13:43.230

that the machine learning,

 

00:13:43.750 --> 00:13:45.251

the models are going to be running in.

 

00:13:46.853 --> 00:13:47.513

What are some of the

 

00:13:47.573 --> 00:13:49.575

challenges that you think

 

00:13:49.635 --> 00:13:51.076

we will see as we move

 

00:13:51.156 --> 00:13:51.917

forward and begin

 

00:13:52.017 --> 00:13:53.638

implementing AI into our

 

00:13:54.058 --> 00:13:55.119

different programs,

 

00:13:55.139 --> 00:13:56.520

and especially for security

 

00:13:56.581 --> 00:13:58.262

programs from that perspective?

 

00:13:59.292 --> 00:14:00.293

I think the first big

 

00:14:00.313 --> 00:14:03.417

challenge is just going to

 

00:14:03.437 --> 00:14:05.158

be to weave security

 

00:14:05.299 --> 00:14:08.082

into the entire lifecycle.

 

00:14:08.762 --> 00:14:10.684

Being able to get governance

 

00:14:11.105 --> 00:14:12.106

and be able to see, know,

 

00:14:12.146 --> 00:14:13.247

and manage what's happening

 

00:14:13.287 --> 00:14:14.729

in our machine learning life cycles.

 

00:14:15.174 --> 00:14:16.135

starting from where those

 

00:14:16.175 --> 00:14:17.456

models are coming from as

 

00:14:17.496 --> 00:14:18.197

sort of a software

 

00:14:18.237 --> 00:14:19.978

composition analysis around

 

00:14:19.998 --> 00:14:21.059

the models and create a

 

00:14:21.079 --> 00:14:21.760

bill of materials.

 

00:14:22.400 --> 00:14:23.642

What models are we using?

 

00:14:23.882 --> 00:14:24.682

Who was training them?

 

00:14:24.823 --> 00:14:25.944

What data are we using?

 

00:14:26.364 --> 00:14:27.645

Where have those data sets

 

00:14:27.685 --> 00:14:29.166

been used within our environment?

 

00:14:29.186 --> 00:14:30.027

So I think that's a really

 

00:14:30.087 --> 00:14:31.849

big challenge is to weave

 

00:14:31.889 --> 00:14:33.710

the security into the process.

 

00:14:35.072 --> 00:14:36.113

What about wins?

 

00:14:37.294 --> 00:14:38.815

Has it helped in any way as

 

00:14:38.875 --> 00:14:42.158

far as implementing security programs?

 

00:14:43.422 --> 00:14:44.482

I think that there's a lot

 

00:14:44.502 --> 00:14:45.843

of opportunity for big wins.

 

00:14:45.863 --> 00:14:47.544

We've already seen machine

 

00:14:47.584 --> 00:14:48.744

learning help us quite a

 

00:14:48.804 --> 00:14:50.925

bit in anomaly detection

 

00:14:51.365 --> 00:14:52.866

and in classification.

 

00:14:53.026 --> 00:14:54.426

Is this a phish, or is this not a phish?

 

00:14:54.487 --> 00:14:55.107

Is it perfect?

 

00:14:55.427 --> 00:14:55.547

No.

 

00:14:55.987 --> 00:14:59.949

But already, most of us, if you're using

 

00:15:01.109 --> 00:15:01.870

email from one of the big

 

00:15:01.950 --> 00:15:03.472

providers or Microsoft or Google,

 

00:15:03.712 --> 00:15:04.853

machine learning is helping

 

00:15:04.913 --> 00:15:06.615

to make sure that the

 

00:15:06.855 --> 00:15:08.457

emails that get to you are

 

00:15:08.497 --> 00:15:09.337

legitimate emails.

 

00:15:09.418 --> 00:15:10.539

Again, not perfect,

 

00:15:10.739 --> 00:15:12.280

but certainly giving you a

 

00:15:12.340 --> 00:15:14.122

leg up and reducing the

 

00:15:14.182 --> 00:15:15.644

noise as much as possible.

 

00:15:15.724 --> 00:15:17.545

So that's been a nice win.

 

00:15:17.645 --> 00:15:19.167

Also looking at the

 

00:15:19.227 --> 00:15:21.449

anomalies and deviations from the norm,

 

00:15:21.469 --> 00:15:21.990

because that's something

 

00:15:22.030 --> 00:15:22.871

machine learning

 

00:15:22.891 --> 00:15:24.692

is really good at pattern matching.

 

00:15:24.712 --> 00:15:25.293

Yeah.

 

00:15:25.613 --> 00:15:27.474

and finding patterns in very,

 

00:15:27.514 --> 00:15:29.234

very vast amounts of data.

 

00:15:29.614 --> 00:15:31.655

So a human being seeing that

 

00:15:31.695 --> 00:15:32.875

something looks anomalous

 

00:15:32.935 --> 00:15:33.816

or off the baseline,

 

00:15:34.376 --> 00:15:35.476

if you've got thousands or

 

00:15:35.476 --> 00:15:37.157

100,000 users at your company,

 

00:15:37.177 --> 00:15:38.077

that could be really hard

 

00:15:38.117 --> 00:15:39.757

for a human to detect.

 

00:15:39.998 --> 00:15:40.678

But that's something where

 

00:15:40.698 --> 00:15:42.338

machine learning can be very,

 

00:15:42.398 --> 00:15:43.419

very good at showing you

 

00:15:43.459 --> 00:15:44.339

those little bits.

 

00:15:44.839 --> 00:15:46.581

The anomalous flag or alert

 

00:15:46.661 --> 00:15:47.922

may not mean that something

 

00:15:47.982 --> 00:15:49.163

nefarious is happening in

 

00:15:49.203 --> 00:15:50.024

your organization,

 

00:15:50.404 --> 00:15:51.725

but it gives you a good

 

00:15:51.865 --> 00:15:53.146

view into where those

 

00:15:53.247 --> 00:15:54.468

analysts can look to see,

 

00:15:54.848 --> 00:15:56.209

is this something that we expected?

 

00:15:56.289 --> 00:15:57.951

Is this a human being a human,

 

00:15:58.311 --> 00:15:59.532

or is this a human being a

 

00:15:59.592 --> 00:16:00.293

malicious human?

 

00:16:00.773 --> 00:16:03.856

Right, right.

 

00:16:03.876 --> 00:16:04.957

What are you most excited

 

00:16:05.017 --> 00:16:06.498

about at the intersection

 

00:16:06.538 --> 00:16:07.960

of AI and security?

 

00:16:09.275 --> 00:16:10.536

I am very excited about the

 

00:16:10.596 --> 00:16:12.258

ability to look at the vast

 

00:16:12.298 --> 00:16:13.539

amounts of data and start

 

00:16:13.559 --> 00:16:14.800

to see perhaps patterns

 

00:16:14.840 --> 00:16:15.821

that we hadn't been able to

 

00:16:15.861 --> 00:16:17.363

see before about attacks

 

00:16:17.483 --> 00:16:18.784

and also to help us

 

00:16:18.864 --> 00:16:19.805

understand again that

 

00:16:19.885 --> 00:16:21.587

deviation from the norm.

 

00:16:21.627 --> 00:16:22.568

I think that those are two

 

00:16:22.708 --> 00:16:23.609

really wonderful

 

00:16:23.809 --> 00:16:24.870

applications and we're

 

00:16:24.890 --> 00:16:26.632

seeing some nice advances there.

 

00:16:27.553 --> 00:16:29.574

There's also potentially

 

00:16:29.755 --> 00:16:30.655

getting a little bit more

 

00:16:30.836 --> 00:16:33.358

agency and response on, you know,

 

00:16:33.458 --> 00:16:35.300

if something looks wrong,

 

00:16:36.060 --> 00:16:37.802

what action do we take in response?

 

00:16:37.862 --> 00:16:40.104

I think that might be, you know,

 

00:16:40.224 --> 00:16:41.245

as we get better, we train,

 

00:16:41.265 --> 00:16:42.446

we go in observation mode,

 

00:16:42.506 --> 00:16:43.467

will we be able to get a

 

00:16:43.487 --> 00:16:44.488

little bit more proactive?

 

00:16:44.868 --> 00:16:46.269

That's fairly exciting.

 

00:16:46.389 --> 00:16:47.110

And then the other thing is

 

00:16:47.150 --> 00:16:47.771

around training.

 

00:16:48.411 --> 00:16:49.532

Because some of these tools

 

00:16:49.592 --> 00:16:51.294

can really make it easier.

 

00:16:51.334 --> 00:16:52.795

They emulate what it is to

 

00:16:52.835 --> 00:16:53.856

be talking to a person.

 

00:16:53.896 --> 00:16:54.697

So they give us some good

 

00:16:54.717 --> 00:16:55.938

opportunity for training

 

00:16:55.978 --> 00:16:57.360

and improving that training.

 

00:16:58.321 --> 00:17:01.043

So you've been in cybersecurity,

 

00:17:01.143 --> 00:17:03.125

as you said, across three decades.

 

00:17:03.846 --> 00:17:05.708

What has been one of the

 

00:17:05.768 --> 00:17:06.869

greatest pieces of advice

 

00:17:06.889 --> 00:17:08.090

that maybe a mentor has

 

00:17:08.130 --> 00:17:10.833

given you that has helped

 

00:17:10.873 --> 00:17:11.874

you throughout your career?

 

00:17:14.803 --> 00:17:17.486

Just believing that you can

 

00:17:17.566 --> 00:17:19.207

do it. That actually came from

 

00:17:19.728 --> 00:17:21.309

one of my very first mentors.

 

00:17:22.090 --> 00:17:23.951

I was working in editorial

 

00:17:24.172 --> 00:17:25.953

and my love of systems,

 

00:17:26.033 --> 00:17:27.054

I had started to do

 

00:17:27.174 --> 00:17:29.016

acquisitions of software

 

00:17:29.056 --> 00:17:30.077

for the textbooks that we

 

00:17:30.117 --> 00:17:30.738

were publishing.

 

00:17:31.478 --> 00:17:32.699

And the person that hired me

 

00:17:32.739 --> 00:17:33.880

into my first IT role,

 

00:17:33.940 --> 00:17:35.362

she saw me and that skill

 

00:17:35.502 --> 00:17:36.963

and thought I could do it.

 

00:17:36.983 --> 00:17:38.264

And I was like, I don't know.

 

00:17:38.284 --> 00:17:40.646

I think I have to have more tech or this.

 

00:17:40.686 --> 00:17:42.288

And she said, no, you've got it.

 

00:17:42.328 --> 00:17:43.328

You've already got that.

 

00:17:43.469 --> 00:17:44.570

And I've heard that from

 

00:17:44.670 --> 00:17:46.091

other people in my career.

 

00:17:46.131 --> 00:17:47.372

There's a sense of that

 

00:17:47.432 --> 00:17:48.673

somehow you're going to get

 

00:17:48.753 --> 00:17:51.235

to a level suddenly and then you're that.

 

00:17:51.455 --> 00:17:51.996

But it's

 

00:17:52.236 --> 00:17:52.416

No,

 

00:17:52.636 --> 00:17:54.298

you go out and you do it and you build

 

00:17:54.338 --> 00:17:55.338

those skills and nobody

 

00:17:55.378 --> 00:17:56.399

comes along and says, oh,

 

00:17:56.880 --> 00:17:58.321

now you're a threat hunter

 

00:17:58.341 --> 00:17:59.942

or now you're a SOC analyst.

 

00:17:59.962 --> 00:18:01.043

You might get that title.

 

00:18:01.503 --> 00:18:02.904

But, you know,

 

00:18:03.044 --> 00:18:03.965

it's really the work

 

00:18:04.005 --> 00:18:06.527

that you've done to get into that role.

 

00:18:06.607 --> 00:18:07.928

So I think just really

 

00:18:08.308 --> 00:18:10.150

understanding and having

 

00:18:10.230 --> 00:18:11.691

faith in yourself, do the work,

 

00:18:12.151 --> 00:18:13.412

but then also give yourself

 

00:18:13.492 --> 00:18:15.133

credit that you have done the work, which,

 

00:18:15.734 --> 00:18:15.934

you know,

 

00:18:15.954 --> 00:18:16.815

that's always been a little bit

 

00:18:16.835 --> 00:18:17.255

hard for me.

 

00:18:17.893 --> 00:18:18.073

Yeah,

 

00:18:18.113 --> 00:18:19.854

I think that can be hard for a lot of

 

00:18:19.894 --> 00:18:21.335

people too. There's, you know,

 

00:18:21.355 --> 00:18:22.576

we hear about imposter

 

00:18:22.596 --> 00:18:24.338

syndrome and not feeling like, you know,

 

00:18:24.378 --> 00:18:25.979

you're supposed to be here,

 

00:18:26.139 --> 00:18:26.659

but we're here.

 

00:18:26.679 --> 00:18:27.980

We're doing the work.

 

00:18:28.060 --> 00:18:29.261

So yeah.

 

00:18:29.581 --> 00:18:30.782

What advice would you have

 

00:18:30.842 --> 00:18:32.083

to somebody who is looking

 

00:18:32.123 --> 00:18:33.704

at getting into either

 

00:18:33.764 --> 00:18:35.666

cybersecurity or specifically AI?

 

00:18:37.541 --> 00:18:38.341

In both cases,

 

00:18:38.521 --> 00:18:40.122

what is it that you really love to do?

 

00:18:40.342 --> 00:18:41.403

Why do you want to do it?

 

00:18:41.603 --> 00:18:42.503

And I've talked to people

 

00:18:42.563 --> 00:18:44.264

going into security and now

 

00:18:44.284 --> 00:18:45.724

security of AI because it's

 

00:18:45.784 --> 00:18:47.405

the one that's getting a

 

00:18:47.445 --> 00:18:48.486

lot of attention right now.

 

00:18:49.206 --> 00:18:51.407

And I've heard, well,

 

00:18:51.427 --> 00:18:52.887

because I don't want to be unemployed.

 

00:18:53.788 --> 00:18:55.028

I want to make a lot of money.

 

00:18:55.689 --> 00:18:57.029

I want to get the bad guys.

 

00:18:57.129 --> 00:18:58.350

Okay, those are all good things.

 

00:18:58.370 --> 00:18:59.910

Think about why it is that

 

00:18:59.930 --> 00:19:01.411

you're going into this

 

00:19:01.627 --> 00:19:01.987

field,

 

00:19:02.567 --> 00:19:04.488

and then which part of the field

 

00:19:04.548 --> 00:19:05.708

really interests you.

 

00:19:06.008 --> 00:19:07.609

There are highly technical

 

00:19:07.649 --> 00:19:09.970

roles in security and security for AI,

 

00:19:10.310 --> 00:19:11.570

and there are less technical roles.

 

00:19:11.630 --> 00:19:13.811

We need lawyers, we need graphic designers,

 

00:19:13.871 --> 00:19:15.211

we need sociologists to

 

00:19:15.251 --> 00:19:15.971

help us understand the

 

00:19:15.991 --> 00:19:17.052

psychology of the people

 

00:19:17.072 --> 00:19:18.412

that are responding to these systems.

 

00:19:18.812 --> 00:19:19.812

We need a whole lot of

 

00:19:19.952 --> 00:19:21.193

insight and intelligence

 

00:19:21.353 --> 00:19:22.793

around these problems of

 

00:19:22.853 --> 00:19:24.874

both cyber and cyber and AI.

 

00:19:26.167 --> 00:19:27.309

So what is it you love to do?

 

00:19:27.670 --> 00:19:28.772

Because if it's your passion

 

00:19:28.812 --> 00:19:29.714

and you love to do it,

 

00:19:29.754 --> 00:19:31.818

you're much more likely to succeed in it,

 

00:19:32.419 --> 00:19:33.080

especially because you're

 

00:19:33.100 --> 00:19:34.142

not going to show up every day.

 

00:19:34.523 --> 00:19:35.003

I hate this.

 

00:19:35.023 --> 00:19:36.146

You're going to show up and be like,

 

00:19:36.266 --> 00:19:38.009

I really want to do this work.

 

00:19:38.892 --> 00:19:41.454

So really understand

 

00:19:41.534 --> 00:19:42.534

what it is you want to do,

 

00:19:42.594 --> 00:19:43.775

start to get a feel for

 

00:19:43.895 --> 00:19:45.896

which area is most interesting to you.

 

00:19:46.176 --> 00:19:47.237

And then once you've got that,

 

00:19:47.477 --> 00:19:48.818

you can get better advice

 

00:19:48.878 --> 00:19:50.539

and guidance on exactly what to do.

 

00:19:50.559 --> 00:19:51.379

If somebody says that they

 

00:19:51.419 --> 00:19:52.600

want a really technical path,

 

00:19:52.800 --> 00:19:53.541

they don't like people,

 

00:19:53.821 --> 00:19:54.781

they want to sit with their

 

00:19:54.821 --> 00:19:55.742

computer all the time,

 

00:19:56.022 --> 00:19:58.163

and they just want to look at code, okay,

 

00:19:58.443 --> 00:19:59.344

then they might be really

 

00:19:59.384 --> 00:20:00.605

good as a threat hunter,

 

00:20:00.905 --> 00:20:02.886

they might be good as a malware engineer,

 

00:20:02.966 --> 00:20:04.027

or be able to

 

00:20:07.205 --> 00:20:08.386

look for vulnerabilities

 

00:20:08.466 --> 00:20:10.208

in other systems, be a red teamer.

 

00:20:10.468 --> 00:20:11.429

There are a lot of things

 

00:20:11.449 --> 00:20:13.410

that might be better for that person.

 

00:20:13.450 --> 00:20:15.132

And then if that's where they wanted to go,

 

00:20:15.152 --> 00:20:15.972

I would say things like

 

00:20:16.273 --> 00:20:16.973

start going to the more

 

00:20:17.013 --> 00:20:19.595

technical conferences, go to,

 

00:20:19.916 --> 00:20:22.198

go to CFPs and compete in CFPs,

 

00:20:22.258 --> 00:20:26.982

capture the, not CFPs, CFTs, CTFs, sorry.

 

00:20:27.182 --> 00:20:27.842

Exactly.

 

00:20:29.224 --> 00:20:30.846

capture that, call for papers,

 

00:20:31.928 --> 00:20:34.612

capture the flag and participate in that.

 

00:20:35.012 --> 00:20:35.774

And then that's going to

 

00:20:35.794 --> 00:20:36.575

start building out a

 

00:20:36.595 --> 00:20:37.656

network and people are going to, oh,

 

00:20:37.716 --> 00:20:39.299

this person has mad skills

 

00:20:39.880 --> 00:20:41.262

doing this kind of work.

 

00:20:41.722 --> 00:20:43.164

Then that network could help

 

00:20:43.225 --> 00:20:44.166

you find a job.

 

00:20:44.346 --> 00:20:44.647

If you,

 

00:20:45.107 --> 00:20:45.968

As you're sitting and

 

00:20:46.009 --> 00:20:46.790

interrogating yourself,

 

00:20:46.810 --> 00:20:48.432

which part of this do I like the most,

 

00:20:48.492 --> 00:20:48.772

it's like,

 

00:20:48.912 --> 00:20:50.695

I really like the policy and I

 

00:20:50.755 --> 00:20:51.636

want to write the policy,

 

00:20:51.656 --> 00:20:52.797

but I want it to be the

 

00:20:52.837 --> 00:20:54.900

policy that gets adopted up in Washington,

 

00:20:54.940 --> 00:20:55.401

D.C.

 

00:20:55.681 --> 00:20:56.522

that becomes law.

 

00:20:56.903 --> 00:20:57.944

OK, in that case,

 

00:20:57.964 --> 00:20:58.825

you're probably going to be

 

00:20:58.865 --> 00:21:00.447

veering more towards at

 

00:21:00.548 --> 00:21:01.609

least the government and

 

00:21:01.649 --> 00:21:03.191

potentially even legal policy.

 

00:21:03.539 --> 00:21:04.519

so that you can go up on the

 

00:21:04.539 --> 00:21:05.900

Hill and understand how to

 

00:21:05.960 --> 00:21:07.580

create laws and what those laws are like.

 

00:21:08.380 --> 00:21:09.741

And that would be a different path.

 

00:21:10.121 --> 00:21:11.481

So it's most important first

 

00:21:11.521 --> 00:21:12.621

to really understand which

 

00:21:12.801 --> 00:21:16.862

area of this practice you want to go into,

 

00:21:17.063 --> 00:21:18.163

and then you can start to

 

00:21:18.363 --> 00:21:20.924

tune your own path forward.

 

00:21:21.244 --> 00:21:21.504

However,

 

00:21:21.524 --> 00:21:23.524

there's one thing that in all of it,

 

00:21:23.684 --> 00:21:24.945

I think really does matter,

 

00:21:25.145 --> 00:21:26.305

which is networking.

 

00:21:27.745 --> 00:21:29.445

networking, finding people,

 

00:21:29.946 --> 00:21:31.006

even if you're mostly,

 

00:21:31.146 --> 00:21:32.086

I'm a huge introvert,

 

00:21:32.166 --> 00:21:33.746

even if you're a really big introvert,

 

00:21:33.886 --> 00:21:35.327

finding your people that

 

00:21:35.367 --> 00:21:36.847

you can talk to and finding

 

00:21:36.867 --> 00:21:38.187

the organizations and the

 

00:21:38.227 --> 00:21:39.568

groups that you feel comfortable with,

 

00:21:39.848 --> 00:21:40.588

they're going to A,

 

00:21:40.688 --> 00:21:41.768

help you really advance

 

00:21:41.868 --> 00:21:44.249

because they know things you don't know.

 

00:21:44.309 --> 00:21:45.509

So that's like, how do you learn?

 

00:21:45.529 --> 00:21:46.569

Sometimes it's by talking to

 

00:21:46.609 --> 00:21:47.229

other smart people.

 

00:21:47.510 --> 00:21:48.250

The other thing that will

 

00:21:48.290 --> 00:21:49.610

really help you advance is

 

00:21:49.690 --> 00:21:51.670

that a lot of times in these groups,

 

00:21:52.011 --> 00:21:53.851

they know someone who's hiring somewhere.

 

00:21:54.571 --> 00:21:55.572

And that can help you get

 

00:21:55.592 --> 00:21:57.154

that job or the next job,

 

00:21:57.294 --> 00:21:58.855

or at least understand what

 

00:21:58.895 --> 00:22:00.096

the skills you need to get

 

00:22:00.136 --> 00:22:00.937

that job that you want.

 

00:22:00.977 --> 00:22:01.918

So then you can create and

 

00:22:01.958 --> 00:22:03.720

craft your own plan for advancement.

 

00:22:04.781 --> 00:22:07.363

Yeah, I completely agree with that.

 

00:22:07.523 --> 00:22:08.624

Like that has been, you know,

 

00:22:08.704 --> 00:22:10.926

definitely helpful for me in my path.

 

00:22:10.966 --> 00:22:11.727

And then also just

 

00:22:11.987 --> 00:22:13.189

understanding what

 

00:22:13.249 --> 00:22:14.370

fulfillment means to you.

 

00:22:15.351 --> 00:22:16.712

in selecting roles and

 

00:22:16.732 --> 00:22:18.033

different projects to work on.

 

00:22:18.073 --> 00:22:18.814

I think, you know,

 

00:22:18.854 --> 00:22:20.495

it helps because if you

 

00:22:20.595 --> 00:22:21.957

don't know what fulfills you,

 

00:22:21.977 --> 00:22:23.438

you don't know what your values are,

 

00:22:23.618 --> 00:22:24.719

you could get into a role

 

00:22:24.759 --> 00:22:25.800

and you can make tons of money,

 

00:22:25.840 --> 00:22:26.921

but it'll feel like a drag

 

00:22:27.041 --> 00:22:27.742

every single day.

 

00:22:28.322 --> 00:22:30.264

So, I mean, you gotta, you know, find it,

 

00:22:30.484 --> 00:22:32.987

what it is that you love and drives you.

 

00:22:33.127 --> 00:22:34.168

So, yeah.

 

00:22:34.958 --> 00:22:37.519

Diana, thank you for your time today.

 

00:22:37.539 --> 00:22:39.559

I just wanted to say I

 

00:22:39.599 --> 00:22:40.920

really appreciate your time

 

00:22:41.100 --> 00:22:42.220

and thank you for coming on

 

00:22:42.340 --> 00:22:43.800

as my first guest for the

 

00:22:44.281 --> 00:22:46.081

On Cyber & AI Podcast.

 

00:22:46.181 --> 00:22:50.442

We love you here at ITSP.

 

00:22:50.582 --> 00:22:51.022

So yes,

 

00:22:51.142 --> 00:22:53.443

thank you so much and thank you to

 

00:22:53.603 --> 00:22:54.503

everybody tuning in.

 

00:22:55.624 --> 00:22:56.064

Thanks.

 

00:22:56.804 --> 00:22:57.124

Bye.

 

0:00:02.232 --> 00:00:03.734

Thank you for joining ITSP

 

00:00:03.754 --> 00:00:06.276

Magazine's On Cyber and AI podcast.

 

00:00:06.476 --> 00:00:07.837

I'm your host, Christina Stokes,

 

00:00:08.038 --> 00:00:10.160

and today's guest is Diana Kelly.

 

00:00:10.680 --> 00:00:13.303

I'd love for you to tell us about yourself,

 

00:00:13.323 --> 00:00:13.623

Diana.

 

00:00:14.722 --> 00:00:16.544

sure i am well first of all

 

00:00:16.564 --> 00:00:17.524

thanks for having me here

 

00:00:17.684 --> 00:00:18.925

and congratulations on your

 

00:00:18.966 --> 00:00:20.567

new podcast thank you and

 

00:00:20.707 --> 00:00:23.149

then i have been in i.t for

 

00:00:23.229 --> 00:00:24.650

well over three decades

 

00:00:24.690 --> 00:00:25.431

which is kind of like

 

00:00:25.471 --> 00:00:26.812

mind-blowing when it's like

 

00:00:26.852 --> 00:00:27.993

where did the years go

 

00:00:28.433 --> 00:00:30.235

somewhere um but so i've

 

00:00:30.255 --> 00:00:31.135

seen i've seen a lot of

 

00:00:31.196 --> 00:00:32.497

different technology over

 

00:00:32.557 --> 00:00:34.158

the the course of my career

 

00:00:34.778 --> 00:00:36.660

and i as many people did

 

00:00:36.720 --> 00:00:37.561

started to get really

 

00:00:37.641 --> 00:00:39.843

interested in ai and how ai

 

00:00:39.883 --> 00:00:41.384

was going to change the landscape for us

 

00:00:41.704 --> 00:00:42.704

Starting back when I was at

 

00:00:42.865 --> 00:00:45.046

IBM and we were training

 

00:00:45.086 --> 00:00:47.647

Watson for cybersecurity as

 

00:00:47.727 --> 00:00:49.147

part of what the team was

 

00:00:49.207 --> 00:00:50.808

doing to help others be

 

00:00:50.868 --> 00:00:52.409

able to take advantage of AI.

 

00:00:52.449 --> 00:00:53.910

And then when I was at Microsoft,

 

00:00:53.930 --> 00:00:54.970

I continued to be really

 

00:00:55.090 --> 00:00:56.371

interested in this space

 

00:00:56.431 --> 00:00:57.651

and trying to understand the work,

 

00:00:57.731 --> 00:00:58.712

especially around ethics.

 

00:00:59.528 --> 00:01:01.710

which has brought me to where I am now,

 

00:01:01.750 --> 00:01:02.971

where I am the CISO.

 

00:01:03.051 --> 00:01:04.793

So I'm doing my security,

 

00:01:05.153 --> 00:01:07.515

but at a company called Protect AI,

 

00:01:07.635 --> 00:01:09.077

where we are focusing on

 

00:01:09.137 --> 00:01:11.559

creating tools to secure

 

00:01:11.579 --> 00:01:13.401

and creating a platform to

 

00:01:13.441 --> 00:01:15.342

secure the machine learning

 

00:01:15.423 --> 00:01:18.505

operations lifecycle.

 

00:01:18.766 --> 00:01:20.107

What motivates you every day

 

00:01:20.187 --> 00:01:22.029

and why cybersecurity?

 

00:01:22.069 --> 00:01:23.390

Why have you stayed in this field?

 

00:01:25.335 --> 00:01:26.416

What motivates me every day

 

00:01:26.436 --> 00:01:29.358

is just really making sure,

 

00:01:29.558 --> 00:01:31.799

kind of like not letting

 

00:01:31.819 --> 00:01:32.579

the bad guys win.

 

00:01:32.800 --> 00:01:34.901

If I had to just distill it

 

00:01:34.921 --> 00:01:35.801

down to one thing.

 

00:01:36.001 --> 00:01:38.183

And it started because I got

 

00:01:38.223 --> 00:01:39.303

really fascinated in

 

00:01:39.343 --> 00:01:41.304

technology way back in the

 

00:01:41.785 --> 00:01:43.005

1970s and what could be

 

00:01:43.065 --> 00:01:44.346

done with collaborative

 

00:01:44.406 --> 00:01:45.527

computing and connecting

 

00:01:45.587 --> 00:01:46.948

people and allowing us all

 

00:01:46.988 --> 00:01:47.508

to work together.

 

00:01:47.868 --> 00:01:49.069

So I was just so excited by

 

00:01:49.129 --> 00:01:49.949

what could be done.

 

00:01:50.150 --> 00:01:51.350

By the time I was practicing

 

00:01:51.430 --> 00:01:52.151

professionally,

 

00:01:53.091 --> 00:01:54.712

networks had been joined together.

 

00:01:54.732 --> 00:01:56.192

Email was starting to become

 

00:01:56.232 --> 00:01:56.933

something that was

 

00:01:57.433 --> 00:01:58.774

commonplace in the workforce.

 

00:01:58.834 --> 00:02:00.174

So I know for a lot of people, you're like,

 

00:02:00.214 --> 00:02:01.595

there was a time before email.

 

00:02:02.075 --> 00:02:03.055

And it's like, yes,

 

00:02:03.095 --> 00:02:05.276

we would also ride our horses to work.

 

00:02:07.437 --> 00:02:08.398

But in any case...

 

00:02:09.458 --> 00:02:10.699

I had built and was very

 

00:02:10.799 --> 00:02:12.180

proud of a network for our

 

00:02:12.220 --> 00:02:13.161

startup that I was working

 

00:02:13.201 --> 00:02:14.521

for in Cambridge, Massachusetts.

 

00:02:14.702 --> 00:02:16.623

And this startup wanted to

 

00:02:16.643 --> 00:02:19.445

start distributing our patches using FTP,

 

00:02:19.505 --> 00:02:20.445

which was really kind of

 

00:02:20.465 --> 00:02:21.626

groundbreaking at the time.

 

00:02:21.666 --> 00:02:22.266

It was like, oh,

 

00:02:22.326 --> 00:02:23.607

instead of having to snail

 

00:02:23.627 --> 00:02:25.268

mail a patch on a disk,

 

00:02:25.629 --> 00:02:26.509

I know people are like,

 

00:02:26.669 --> 00:02:28.550

I swear there's a horse involved in here.

 

00:02:28.590 --> 00:02:31.472

So instead of snail mailing

 

00:02:31.512 --> 00:02:32.273

a patch on a disk,

 

00:02:32.313 --> 00:02:33.394

we could just allow our

 

00:02:33.454 --> 00:02:34.874

customers to come in and download it.

 

00:02:34.894 --> 00:02:36.375

So I was really proud that

 

00:02:36.395 --> 00:02:37.116

we had built that out.

 

00:02:37.634 --> 00:02:39.456

But someone attacked that

 

00:02:39.516 --> 00:02:40.917

server that we were using

 

00:02:41.037 --> 00:02:42.898

to do that and got onto my network.

 

00:02:43.779 --> 00:02:46.401

And it was at a very inopportune time.

 

00:02:46.421 --> 00:02:47.862

The first attack that we had

 

00:02:47.922 --> 00:02:49.444

was actually on a Christmas Eve.

 

00:02:50.804 --> 00:02:52.766

They love weekends and holidays.

 

00:02:53.146 --> 00:02:53.546

They do.

 

00:02:53.726 --> 00:02:54.687

They know that people are

 

00:02:54.727 --> 00:02:55.748

going home and that was it.

 

00:02:55.828 --> 00:02:57.049

We had started to go home

 

00:02:57.089 --> 00:02:57.969

and it was just me and a

 

00:02:58.009 --> 00:02:59.230

contractor and we had to

 

00:02:59.290 --> 00:03:00.331

stay that night until we

 

00:03:00.351 --> 00:03:02.192

had figured out where the

 

00:03:02.232 --> 00:03:02.953

attack was coming from,

 

00:03:02.973 --> 00:03:04.754

that we had not only

 

00:03:04.794 --> 00:03:06.175

eradicated it out of our systems,

 

00:03:06.195 --> 00:03:07.276

but we had made sure that

 

00:03:07.416 --> 00:03:08.217

we had prevented the

 

00:03:08.257 --> 00:03:09.218

problem so it couldn't be

 

00:03:10.078 --> 00:03:10.739

they couldn't start the

 

00:03:10.779 --> 00:03:12.841

attack again. But I got kind

 

00:03:12.861 --> 00:03:14.043

of upset, not just because

 

00:03:14.083 --> 00:03:15.204

it was Christmas Eve, but

 

00:03:15.504 --> 00:03:16.426

what I really got upset

 

00:03:16.466 --> 00:03:17.427

about was that we had

 

00:03:17.467 --> 00:03:18.728

worked so hard to create

 

00:03:18.768 --> 00:03:20.750

this network and do good

 

00:03:20.791 --> 00:03:21.672

things with it for our

 

00:03:21.752 --> 00:03:22.913

company and our customers,

 

00:03:23.273 --> 00:03:24.074

and then here were these

 

00:03:24.154 --> 00:03:25.296

attackers that were coming

 

00:03:25.416 --> 00:03:26.958

in and disabling us from

 

00:03:26.998 --> 00:03:28.059

taking, you know, the benefit

 

00:03:28.099 --> 00:03:29.200

of using our network. And I was like,

 

00:03:30.081 --> 00:03:30.741

Never again.

 

00:03:30.761 --> 00:03:31.762

That's not okay.

 

00:03:33.002 --> 00:03:34.022

This isn't your network.

 

00:03:34.102 --> 00:03:34.522

It's ours.

 

00:03:34.562 --> 00:03:35.723

You shouldn't be trying to

 

00:03:36.103 --> 00:03:36.783

take and break it.

 

00:03:37.503 --> 00:03:38.804

So that really is what gets

 

00:03:38.844 --> 00:03:40.264

me up every day and keeps

 

00:03:40.284 --> 00:03:41.524

me motivated is that I

 

00:03:41.764 --> 00:03:43.005

really do want to make sure

 

00:03:43.065 --> 00:03:44.725

that people can get all of

 

00:03:44.785 --> 00:03:47.066

the wonderful benefits of technology.

 

00:03:48.791 --> 00:03:51.953

in as safe and reliable a way as possible.

 

00:03:52.033 --> 00:03:52.413

Obviously,

 

00:03:52.433 --> 00:03:53.294

there's no such thing as

 

00:03:53.394 --> 00:03:54.215

absolute security.

 

00:03:54.695 --> 00:03:55.776

And that is partly why we

 

00:03:55.796 --> 00:03:56.936

have to do our jobs every day,

 

00:03:56.976 --> 00:03:57.717

but to make sure that

 

00:03:57.737 --> 00:03:59.998

people have as low risk as

 

00:04:00.098 --> 00:04:01.239

possible as they're getting

 

00:04:01.259 --> 00:04:02.520

the benefits of technology.

 

00:04:03.100 --> 00:04:03.660

Absolutely.

 

00:04:03.680 --> 00:04:05.722

Technology can be used as a

 

00:04:05.762 --> 00:04:06.802

tool or a weapon.

 

00:04:06.902 --> 00:04:09.404

So I love to hear that there

 

00:04:09.464 --> 00:04:11.305

are people working to keep

 

00:04:12.222 --> 00:04:13.783

it as a tool and prevent

 

00:04:13.823 --> 00:04:15.445

others from using it as a weapon.

 

00:04:15.465 --> 00:04:18.387

At Protect AI, you are the CISO.

 

00:04:18.487 --> 00:04:19.448

Is there anything you can

 

00:04:19.528 --> 00:04:21.530

share about what you're working on there?

 

00:04:22.799 --> 00:04:23.399

Yeah, I mean,

 

00:04:23.640 --> 00:04:25.121

the company itself is we're

 

00:04:25.141 --> 00:04:26.662

building out a platform for

 

00:04:26.722 --> 00:04:28.063

building security into the

 

00:04:28.103 --> 00:04:29.063

machine learning and

 

00:04:29.104 --> 00:04:30.484

security operations lifecycle.

 

00:04:30.525 --> 00:04:31.825

And a lot of times people will say,

 

00:04:31.865 --> 00:04:33.407

but it's machine learning, not AI.

 

00:04:33.907 --> 00:04:35.128

And a lot of generative AI

 

00:04:35.168 --> 00:04:36.769

is actually driven by machine learning.

 

00:04:36.809 --> 00:04:38.370

So as we secure the machine

 

00:04:38.390 --> 00:04:39.211

learning lifecycle,

 

00:04:39.491 --> 00:04:41.312

we create security within

 

00:04:41.672 --> 00:04:42.593

all of the different ways

 

00:04:42.633 --> 00:04:44.154

that those models are used,

 

00:04:44.254 --> 00:04:46.156

including generative AI.

 

00:04:46.236 --> 00:04:48.577

We also have tools that will

 

00:04:48.617 --> 00:04:50.259

help to protect

 

00:04:50.959 --> 00:04:52.520

interactions with generative

 

00:04:52.620 --> 00:04:54.901

AI once it's in production and in use,

 

00:04:55.721 --> 00:04:57.702

both from the person using it,

 

00:04:57.782 --> 00:04:58.703

making sure they're not

 

00:04:58.843 --> 00:05:00.564

asking too much of that

 

00:05:00.624 --> 00:05:02.184

system or giving it

 

00:05:02.224 --> 00:05:03.425

sensitive data that they shouldn't,

 

00:05:03.445 --> 00:05:03.945

for example,

 

00:05:04.325 --> 00:05:05.466

and also making sure that

 

00:05:05.506 --> 00:05:07.327

they're not attacking the

 

00:05:07.367 --> 00:05:08.848

system by getting the

 

00:05:08.888 --> 00:05:10.328

system to do or say

 

00:05:10.388 --> 00:05:11.529

something that it shouldn't.

 

00:05:12.529 --> 00:05:13.510

So there's, you know,

 

00:05:13.750 --> 00:05:14.891

but this all begins at the

 

00:05:14.931 --> 00:05:16.052

beginning of the lifecycle

 

00:05:16.092 --> 00:05:17.493

with our machine learning,

 

00:05:17.533 --> 00:05:18.693

with what models are we

 

00:05:18.733 --> 00:05:19.594

going to download?

 

00:05:19.834 --> 00:05:20.675

What data are we going to

 

00:05:20.715 --> 00:05:21.835

use to train those models?

 

00:05:22.136 --> 00:05:23.156

How do we train those models?

 

00:05:23.196 --> 00:05:24.017

Who trains those models?

 

00:05:24.037 --> 00:05:24.857

How do we label that

 

00:05:24.897 --> 00:05:26.258

training data if we are doing,

 

00:05:26.338 --> 00:05:27.619

you're doing supervised machine learning?

 

00:05:27.919 --> 00:05:29.100

So all of those pieces,

 

00:05:29.140 --> 00:05:30.381

and then we create a

 

00:05:30.441 --> 00:05:31.662

platform that enables you

 

00:05:31.702 --> 00:05:34.284

from the beginning within the lifecycle,

 

00:05:34.404 --> 00:05:34.644

you know,

 

00:05:34.684 --> 00:05:36.365

to create a machine learning

 

00:05:37.125 --> 00:05:37.966

bill of materials.

 

00:05:38.426 --> 00:05:39.547

And do software composition

 

00:05:39.607 --> 00:05:40.728

analysis for your machine

 

00:05:40.768 --> 00:05:41.769

learning models all the way

 

00:05:41.869 --> 00:05:42.950

out to when you're in

 

00:05:42.990 --> 00:05:44.711

production and using generative AI.
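
A minimal sketch of the interaction-guardrail idea described above, where a prompt is screened for obviously sensitive data before it ever reaches a generative AI system. The patterns, prompts, and function names below are hypothetical illustrations, not Protect AI's product or method; it uses only the Python standard library, and a real guardrail would be far more thorough.

import re

# Deliberately naive patterns for illustration: long digit runs (card-like
# numbers) and obvious credential assignments.
SENSITIVE_PATTERNS = [
    re.compile(r"\b\d{13,19}\b"),
    re.compile(r"(?i)\b(password|api[_ ]?key|secret)\b\s*[:=]"),
]

def allowed(prompt: str) -> bool:
    # Return False when the prompt matches any sensitive-data pattern.
    return not any(p.search(prompt) for p in SENSITIVE_PATTERNS)

for prompt in [
    "Summarize this meeting transcript for me.",
    "My api_key = sk-test-123, why does the request fail?",
]:
    print(("SEND" if allowed(prompt) else "BLOCK"), "->", prompt)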

 

00:05:45.432 --> 00:05:47.214

What is one of the biggest

 

00:05:47.914 --> 00:05:48.975

misconceptions that you

 

00:05:49.015 --> 00:05:50.917

have seen or heard in

 

00:05:50.977 --> 00:05:53.679

regards to AI and what

 

00:05:53.719 --> 00:05:54.820

you're specifically working

 

00:05:54.900 --> 00:05:55.961

on in the space that you're in?

 

00:05:57.253 --> 00:05:57.393

Well,

 

00:05:57.413 --> 00:06:00.096

the biggest misconception I hear is

 

00:06:00.356 --> 00:06:03.139

that machine learning is, or AI,

 

00:06:03.319 --> 00:06:05.782

is so incredibly powerful

 

00:06:05.962 --> 00:06:07.224

and intelligent that it's

 

00:06:07.304 --> 00:06:08.525

going to be able to be

 

00:06:08.585 --> 00:06:09.966

powerful and intelligent in

 

00:06:10.127 --> 00:06:11.268

all different dimensions.

 

00:06:12.046 --> 00:06:15.589

of knowledge and that's not

 

00:06:15.669 --> 00:06:16.670

exactly the case.

 

00:06:17.650 --> 00:06:21.173

If we're talking about mathematics,

 

00:06:22.935 --> 00:06:23.835

computers are going to be

 

00:06:23.875 --> 00:06:24.916

able to do things that are

 

00:06:24.936 --> 00:06:27.138

better than people in a lot of cases.

 

00:06:27.238 --> 00:06:27.858

For example,

 

00:06:28.179 --> 00:06:29.540

if I gave you two 20-digit

 

00:06:29.580 --> 00:06:30.921

numbers right now to multiply,

 

00:06:31.621 --> 00:06:32.962

or gave me, right?

 

00:06:33.022 --> 00:06:34.183

Both of us would take quite

 

00:06:34.203 --> 00:06:35.244

a while with pen and paper

 

00:06:35.264 --> 00:06:36.265

trying to do that manually.

 

00:06:36.305 --> 00:06:37.306

A calculator is going to be

 

00:06:37.346 --> 00:06:38.947

able to do that very, very quickly.

 

00:06:39.508 --> 00:06:41.329

And if it's been tested and

 

00:06:41.349 --> 00:06:42.190

deployed properly,

 

00:06:42.270 --> 00:06:43.471

it's going to be able to do

 

00:06:43.511 --> 00:06:45.192

it with a very high level of accuracy.

 

00:06:45.552 --> 00:06:46.633

Human beings aren't also

 

00:06:46.713 --> 00:06:48.054

that great sometimes with our math,

 

00:06:48.114 --> 00:06:48.314

right?

 

00:06:48.355 --> 00:06:48.855

Check the math.

 

00:06:50.956 --> 00:06:52.017

But there are other things

 

00:06:52.037 --> 00:06:52.997

that humans do that we

 

00:06:53.037 --> 00:06:54.618

don't necessarily count as

 

00:06:54.698 --> 00:06:55.779

part of intelligence,

 

00:06:56.199 --> 00:06:58.000

but actually do take quite

 

00:06:58.080 --> 00:06:59.500

a lot of intelligence.

 

00:07:00.061 --> 00:07:02.442

So I use this example a lot,

 

00:07:03.002 --> 00:07:03.822

which is to pick up

 

00:07:03.882 --> 00:07:05.183

something and pick up a

 

00:07:05.223 --> 00:07:06.744

vessel and drink from it.

 

00:07:07.804 --> 00:07:09.625

This actually, to be able to pick this up,

 

00:07:09.685 --> 00:07:10.945

I mean, this could have been in a glass.

 

00:07:11.025 --> 00:07:12.065

I use glasses a lot.

 

00:07:12.105 --> 00:07:13.446

Sometimes I pour this in.

 

00:07:13.826 --> 00:07:15.546

The amount of liquid that's in here,

 

00:07:15.586 --> 00:07:17.167

I actually can't see this right now.

 

00:07:17.507 --> 00:07:18.527

This could be full or it

 

00:07:18.567 --> 00:07:19.468

could be almost empty.

 

00:07:19.808 --> 00:07:20.548

If it was full,

 

00:07:21.028 --> 00:07:22.609

I need to create a

 

00:07:22.669 --> 00:07:24.109

different amount of grip strength.

 

00:07:24.469 --> 00:07:25.409

I need to put a different

 

00:07:25.469 --> 00:07:26.810

amount of force in my arm.

 

00:07:27.050 --> 00:07:28.450

There's a whole lot, bottom line,

 

00:07:28.490 --> 00:07:29.931

there's a whole lot of math

 

00:07:30.051 --> 00:07:32.372

that goes on with something very simple,

 

00:07:32.412 --> 00:07:33.612

but because human beings do

 

00:07:33.652 --> 00:07:35.753

this every day without thinking, we go,

 

00:07:36.133 --> 00:07:36.833

eh, that's easy.

 

00:07:37.233 --> 00:07:38.977

Well, it's not necessarily easy.

 

00:07:39.037 --> 00:07:40.319

Now think about training an

 

00:07:40.740 --> 00:07:42.464

AI-driven robot to be able

 

00:07:42.504 --> 00:07:44.488

to pick up any kind of

 

00:07:45.069 --> 00:07:46.873

vessel and know the exact

 

00:07:46.913 --> 00:07:47.734

amount of strength

 

00:07:48.692 --> 00:07:51.374

to bring it up to drink, right?

 

00:07:51.875 --> 00:07:52.696

So that that's it,

 

00:07:52.796 --> 00:07:54.317

is that we don't

 

00:07:54.417 --> 00:07:56.699

necessarily get down to the

 

00:07:56.759 --> 00:07:58.200

levels of what intelligence is.

 

00:07:58.240 --> 00:07:58.721

And sometimes we

 

00:07:58.961 --> 00:08:00.442

overestimate how hard something is,

 

00:08:00.462 --> 00:08:02.064

because it's hard for us, i.e.

 

00:08:02.084 --> 00:08:02.424

man.

 

00:08:02.864 --> 00:08:03.865

And we underestimate

 

00:08:03.905 --> 00:08:04.646

something that's actually

 

00:08:04.666 --> 00:08:05.367

pretty difficult,

 

00:08:05.387 --> 00:08:06.348

because it's easy for us,

 

00:08:06.408 --> 00:08:07.929

like picking up a glass and

 

00:08:07.969 --> 00:08:09.210

being able to bring it to

 

00:08:09.250 --> 00:08:10.611

our mouths to be able to drink from it.

 

00:08:11.692 --> 00:08:13.954

So that misconception has led to some,

 

00:08:14.295 --> 00:08:15.616

I've heard things like, well, you know,

 

00:08:16.799 --> 00:08:19.200

OpenAI, GenAI, ChatGPT,

 

00:08:19.901 --> 00:08:21.421

it passed the bar and it's

 

00:08:21.441 --> 00:08:22.482

going to be smarter than

 

00:08:22.522 --> 00:08:23.942

lawyers within six months.

 

00:08:25.603 --> 00:08:27.684

Not necessarily because the

 

00:08:27.864 --> 00:08:29.125

kinds of intelligence that

 

00:08:29.185 --> 00:08:30.986

you need to have to be a lawyer

 

00:08:32.186 --> 00:08:33.887

isn't exactly the way that

 

00:08:33.927 --> 00:08:35.828

ChatGPT is working yet.

 

00:08:35.988 --> 00:08:36.848

One of the big problems with

 

00:08:36.868 --> 00:08:37.949

ChatGPT in the legal

 

00:08:37.969 --> 00:08:40.730

profession so far has been the, quote,

 

00:08:40.810 --> 00:08:42.511

hallucinations. It creates

 

00:08:42.551 --> 00:08:44.291

statistically probable

 

00:08:44.852 --> 00:08:47.133

cases that could have been

 

00:08:47.793 --> 00:08:48.553

used to build a precedent.

 

00:08:49.534 --> 00:08:51.054

But it doesn't necessarily

 

00:08:51.094 --> 00:08:52.075

go and check to see if

 

00:08:52.135 --> 00:08:54.576

those cases are actual cases.

 

00:08:54.636 --> 00:08:56.236

Maybe it just invented some

 

00:08:56.276 --> 00:08:56.997

of those cases.

 

00:08:58.337 --> 00:08:59.578

So that misconception,

 

00:08:59.598 --> 00:09:00.539

the misconception that the

 

00:09:00.659 --> 00:09:02.381

AI is always smarter than

 

00:09:02.481 --> 00:09:04.162

us and that it's always

 

00:09:04.222 --> 00:09:05.444

going to be better than humans.

 

00:09:05.544 --> 00:09:06.705

I think that that's a really

 

00:09:07.125 --> 00:09:08.627

big one that we

 

00:09:08.707 --> 00:09:10.428

just, as all of us, need

 

00:09:10.448 --> 00:09:12.330

to be thinking about

 

00:09:12.610 --> 00:09:13.771

where it's really smarter,

 

00:09:13.831 --> 00:09:15.112

what the best use cases are

 

00:09:15.253 --> 00:09:16.494

and where we really

 

00:09:16.534 --> 00:09:17.695

need to have humans still

 

00:09:17.755 --> 00:09:18.976

involved with the thinking and nuance.

 

00:09:20.177 --> 00:09:21.398

The other big misconception

 

00:09:21.958 --> 00:09:23.418

that is concerning and one

 

00:09:23.458 --> 00:09:25.179

that at Protect AI we're

 

00:09:25.219 --> 00:09:27.600

very much hoping to help with quite a bit,

 

00:09:28.220 --> 00:09:30.541

which is that some people

 

00:09:30.601 --> 00:09:33.122

kind of think that AI just

 

00:09:33.202 --> 00:09:35.263

got invented in November

 

00:09:35.263 --> 00:09:36.603

2022 and that ChatGPT was

 

00:09:36.643 --> 00:09:38.364

the first real deployment of AI.

 

00:09:38.724 --> 00:09:39.705

That's not the case.

 

00:09:39.785 --> 00:09:41.065

I had mentioned, you know, being at

 

00:09:41.145 --> 00:09:42.506

IBM when Watson for Cyber

 

00:09:42.546 --> 00:09:43.386

was being trained.

 

00:09:43.786 --> 00:09:45.347

And that was eight years ago.

 

00:09:46.307 --> 00:09:47.448

Machine learning models have

 

00:09:47.488 --> 00:09:49.088

been in use in a number of

 

00:09:49.148 --> 00:09:50.889

different sectors and organizations.

 

00:09:51.229 --> 00:09:52.630

Big Pharma, for example,

 

00:09:52.690 --> 00:09:54.311

for trials and looking at

 

00:09:54.371 --> 00:09:57.192

data around new medications, for example,

 

00:09:57.292 --> 00:09:58.413

and financial services.

 

00:09:58.793 --> 00:09:59.954

Some of us may already be

 

00:10:00.134 --> 00:10:01.995

using ML within our

 

00:10:02.055 --> 00:10:02.875

financial services

 

00:10:02.915 --> 00:10:04.556

portfolio if anybody signed

 

00:10:04.576 --> 00:10:07.057

up for a quantitative or robo advisor,

 

00:10:07.077 --> 00:10:07.377

right?

 

00:10:07.397 --> 00:10:08.438

That's machine learning driven.

 

00:10:09.218 --> 00:10:10.619

So the misconception that

 

00:10:10.679 --> 00:10:12.540

this just happened and now

 

00:10:12.580 --> 00:10:13.640

we have to think about how

 

00:10:13.680 --> 00:10:15.081

we secure it going forward.

 

00:10:16.153 --> 00:10:16.713

It's here.

 

00:10:16.894 --> 00:10:18.355

It's been in use in many,

 

00:10:18.395 --> 00:10:19.535

many organizations and

 

00:10:19.595 --> 00:10:21.957

sectors for a significant amount of time.

 

00:10:22.057 --> 00:10:23.058

And the most important thing

 

00:10:23.118 --> 00:10:24.759

is to understand that we

 

00:10:24.799 --> 00:10:26.881

need to build security into

 

00:10:26.941 --> 00:10:28.762

that lifecycle the same way

 

00:10:28.802 --> 00:10:29.863

that we've built it into

 

00:10:29.983 --> 00:10:31.604

our other security lifecycle.

 

00:10:31.664 --> 00:10:33.445

So if you're thinking about DevSecOps,

 

00:10:33.485 --> 00:10:34.466

build security into your

 

00:10:34.486 --> 00:10:35.386

development lifecycle.

 

00:10:36.007 --> 00:10:36.827

MLSecOps,

 

00:10:37.128 --> 00:10:38.509

build security into your machine

 

00:10:38.549 --> 00:10:39.429

learning lifecycle.

 

00:10:39.750 --> 00:10:39.970

Great.

 

00:10:40.630 --> 00:10:42.631

I think that with the

 

00:10:42.651 --> 00:10:44.111

surprise everybody had with

 

00:10:44.251 --> 00:10:46.171

AI when it exploded last year,

 

00:10:46.632 --> 00:10:47.832

my thoughts did go to, well,

 

00:10:47.872 --> 00:10:49.772

we have had ML and a lot of

 

00:10:49.812 --> 00:10:51.253

people don't realize just

 

00:10:51.613 --> 00:10:53.273

we have had ML and what

 

00:10:53.313 --> 00:10:55.174

that has been built into.

 

00:10:55.194 --> 00:10:58.375

And with AI, like, yes, it is smart,

 

00:10:58.595 --> 00:10:59.655

but it doesn't have the

 

00:10:59.695 --> 00:11:00.975

contextual knowledge that

 

00:11:01.055 --> 00:11:03.356

humans have from our own experiences.

 

00:11:04.102 --> 00:11:05.884

And that I think is an area

 

00:11:05.924 --> 00:11:08.446

that AI would need to train or learn in.

 

00:11:08.666 --> 00:11:12.329

And that's a very nuanced

 

00:11:12.429 --> 00:11:13.690

thing for it to be able to

 

00:11:13.730 --> 00:11:15.051

do as a machine.

 

00:11:16.672 --> 00:11:18.333

What have you learned,

 

00:11:18.754 --> 00:11:20.095

something new that you have

 

00:11:20.155 --> 00:11:22.136

learned in this field now

 

00:11:22.176 --> 00:11:25.039

that AI has been such a big

 

00:11:25.119 --> 00:11:26.560

piece of this now?

 

00:11:27.637 --> 00:11:27.857

Well,

 

00:11:27.957 --> 00:11:29.799

I've certainly been on a journey of

 

00:11:30.120 --> 00:11:31.901

learning a lot more about

 

00:11:31.982 --> 00:11:32.902

how machine learning

 

00:11:32.983 --> 00:11:35.205

engineers and data scientists work.

 

00:11:35.914 --> 00:11:38.497

because I came to it from security.

 

00:11:38.557 --> 00:11:40.559

So I brought all my security knowledge in,

 

00:11:40.859 --> 00:11:42.781

but I wasn't a deep machine learning.

 

00:11:42.841 --> 00:11:44.723

I'm not an ML engineer myself.

 

00:11:45.244 --> 00:11:46.685

So it's really learning

 

00:11:46.745 --> 00:11:49.208

about data and how

 

00:11:49.308 --> 00:11:50.869

different data is in the

 

00:11:50.910 --> 00:11:52.651

machine learning and AI world,

 

00:11:52.972 --> 00:11:55.114

how different it is thinking about data.

 

00:11:55.614 --> 00:11:56.195

For example,

 

00:11:56.255 --> 00:11:57.536

if you're training a machine

 

00:11:57.576 --> 00:11:58.117

learning model,

 

00:11:58.470 --> 00:12:00.131

you may well want to train

 

00:12:00.191 --> 00:12:02.132

it on real live data.

 

00:12:02.772 --> 00:12:04.033

It may be anonymized.

 

00:12:04.053 --> 00:12:04.273

I mean,

 

00:12:04.734 --> 00:12:06.214

you may give instructions to the

 

00:12:06.254 --> 00:12:07.355

model to not give out

 

00:12:07.415 --> 00:12:08.716

information about the training data,

 

00:12:08.736 --> 00:12:10.056

but you want to train it on

 

00:12:10.097 --> 00:12:12.298

live data very often so

 

00:12:12.338 --> 00:12:13.478

that you know that it's

 

00:12:13.498 --> 00:12:15.840

been fit properly for its purpose.

 

00:12:16.200 --> 00:12:17.261

Whereas when you're thinking

 

00:12:17.321 --> 00:12:18.501

about software development,

 

00:12:19.102 --> 00:12:20.542

and I've been doing this

 

00:12:20.602 --> 00:12:23.744

for a long time with secure life cycles,

 

00:12:26.378 --> 00:12:28.960

With this development, as you look at that,

 

00:12:29.040 --> 00:12:31.081

we were always about don't

 

00:12:31.181 --> 00:12:32.982

let the real data out.

 

00:12:33.022 --> 00:12:34.123

Don't put production data

 

00:12:34.203 --> 00:12:35.284

into your test data

 

00:12:35.324 --> 00:12:36.224

You wanted to always

 

00:12:36.264 --> 00:12:37.585

have the separation of the

 

00:12:37.645 --> 00:12:39.206

data. And in machine learning,

 

00:12:39.266 --> 00:12:42.348

it's a very different world with the data.

 

00:12:43.229 --> 00:12:44.129

Has there been any

 

00:12:44.269 --> 00:12:47.391

surprising lessons or key

 

00:12:47.492 --> 00:12:48.692

takeaways that you have

 

00:12:49.233 --> 00:12:51.294

gleaned over this process?

 

00:12:52.737 --> 00:12:53.197

Yeah, I mean,

 

00:12:53.237 --> 00:12:54.578

I think as we're looking at

 

00:12:54.758 --> 00:12:55.939

all of the great research

 

00:12:55.979 --> 00:12:58.160

that's being done by the community,

 

00:12:58.921 --> 00:13:00.862

there have been a series of lessons.

 

00:13:00.922 --> 00:13:02.463

A big lesson was in the

 

00:13:02.503 --> 00:13:04.064

supply chain of machine

 

00:13:04.084 --> 00:13:06.185

learning itself that we need to,

 

00:13:06.265 --> 00:13:06.925

of course,

 

00:13:07.426 --> 00:13:09.287

worry about the data and the

 

00:13:09.347 --> 00:13:10.808

models in the machine learning.

 

00:13:10.828 --> 00:13:11.708

But the machine learning is

 

00:13:11.868 --> 00:13:14.610

running on top of a platform very often,

 

00:13:15.429 --> 00:13:17.391

or a management tool for workflow.

 

00:13:17.671 --> 00:13:18.591

So we have to also make sure

 

00:13:18.631 --> 00:13:20.473

that that supply chain is secure.

 

00:13:20.513 --> 00:13:22.474

We can't just say, oh,

 

00:13:22.834 --> 00:13:24.535

machine learning's just in

 

00:13:24.595 --> 00:13:26.156

its own little bubble and it's separate.

 

00:13:26.537 --> 00:13:28.438

It's running in an environment.

 

00:13:28.458 --> 00:13:29.579

So we also have to make sure

 

00:13:29.599 --> 00:13:30.599

that we've secured that.

 

00:13:31.020 --> 00:13:31.920

So that's been a big

 

00:13:31.940 --> 00:13:33.021

eye-opener for me about

 

00:13:35.203 --> 00:13:37.385

how we have to apply the security,

 

00:13:37.645 --> 00:13:38.646

not just to the machine

 

00:13:38.666 --> 00:13:40.107

learning operations lifecycle,

 

00:13:40.127 --> 00:13:42.289

but also to the environment

 

00:13:42.329 --> 00:13:43.230

that the machine learning,

 

00:13:43.750 --> 00:13:45.251

the models are going to be running in.
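
A small, concrete sketch of the supply-chain point above: refuse to load a model artifact unless its hash matches a value that was pinned when the model was approved. The file name, pinned value, and flow are assumptions made up for illustration, not a description of Protect AI's platform; Python standard library only.

import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    # Stream the file so large model artifacts do not need to fit in memory.
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_before_load(path: Path, pinned_sha256: str) -> bytes:
    # Only hand the bytes to the ML runtime once the pin matches.
    actual = sha256_of(path)
    if actual != pinned_sha256:
        raise RuntimeError(f"{path.name}: hash {actual} does not match the pinned value")
    return path.read_bytes()

if __name__ == "__main__":
    artifact = Path("classifier.bin")
    artifact.write_bytes(b"approved model bytes")  # stand-in artifact so the sketch runs
    pin = sha256_of(artifact)                      # in practice, recorded at approval time
    model_bytes = verify_before_load(artifact, pin)
    print(f"loaded {len(model_bytes)} verified bytes")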

 

00:13:46.853 --> 00:13:47.513

What are some of the

 

00:13:47.573 --> 00:13:49.575

challenges that you think

 

00:13:49.635 --> 00:13:51.076

we will see as we move

 

00:13:51.156 --> 00:13:51.917

forward and begin

 

00:13:52.017 --> 00:13:53.638

implementing AI into our

 

00:13:54.058 --> 00:13:55.119

different programs,

 

00:13:55.139 --> 00:13:56.520

and especially for security

 

00:13:56.581 --> 00:13:58.262

programs from that perspective?

 

00:13:59.292 --> 00:14:00.293

I think the first big

 

00:14:00.313 --> 00:14:03.417

challenge is just going to

 

00:14:03.437 --> 00:14:05.158

be able to weave security

 

00:14:05.299 --> 00:14:08.082

into the entire lifecycle.

 

00:14:08.762 --> 00:14:10.684

Being able to get governance

 

00:14:11.105 --> 00:14:12.106

and be able to see, know,

 

00:14:12.146 --> 00:14:13.247

and manage what's happening

 

00:14:13.287 --> 00:14:14.729

in our machine learning life cycles.

 

00:14:15.174 --> 00:14:16.135

starting from where those

 

00:14:16.175 --> 00:14:17.456

models are coming from as

 

00:14:17.496 --> 00:14:18.197

sort of a software

 

00:14:18.237 --> 00:14:19.978

composition analysis around

 

00:14:19.998 --> 00:14:21.059

the models and create a

 

00:14:21.079 --> 00:14:21.760

bill of materials.

 

00:14:22.400 --> 00:14:23.642

What models are we using?

 

00:14:23.882 --> 00:14:24.682

Who was training them?

 

00:14:24.823 --> 00:14:25.944

What data are we using?

 

00:14:26.364 --> 00:14:27.645

Where have those data sets

 

00:14:27.685 --> 00:14:29.166

been used within our environment?

 

00:14:29.186 --> 00:14:30.027

So I think that's a really

 

00:14:30.087 --> 00:14:31.849

big challenge is to weave

 

00:14:31.889 --> 00:14:33.710

the security into the process.
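
As a rough sketch of the bill-of-materials idea in that answer, assuming a deliberately simple schema: record, for each model, where it came from, its hash, what data trained it, and who trained it. The field names, paths, and example values are hypothetical, not a real ML-BOM standard or product format; Python standard library only.

import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def sha256_of(path: Path) -> str:
    # Hash the artifact so later stages can verify it has not been swapped.
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def bom_entry(model_path: Path, source: str, training_data: list, trained_by: str) -> dict:
    # Answers the governance questions above: what model, from where,
    # trained on what data, and by whom.
    return {
        "model_file": model_path.name,
        "sha256": sha256_of(model_path),
        "source": source,
        "training_data": training_data,
        "trained_by": trained_by,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }

if __name__ == "__main__":
    model = Path("sentiment.onnx")
    model.write_bytes(b"placeholder model bytes")  # stand-in file so the sketch runs
    entry = bom_entry(
        model,
        "https://example.com/model-zoo/sentiment",  # hypothetical source
        ["s3://example-bucket/reviews-2023.csv"],   # hypothetical training set
        "ml-team@example.com",
    )
    print(json.dumps(entry, indent=2))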

 

00:14:35.072 --> 00:14:36.113

What about wins?

 

00:14:37.294 --> 00:14:38.815

Has it helped in any way as

 

00:14:38.875 --> 00:14:42.158

far as implementing security programs?

 

00:14:43.422 --> 00:14:44.482

I think that there's a lot

 

00:14:44.502 --> 00:14:45.843

of opportunity for big wins.

 

00:14:45.863 --> 00:14:47.544

We've already seen machine

 

00:14:47.584 --> 00:14:48.744

learning help us quite a

 

00:14:48.804 --> 00:14:50.925

bit in anomaly detection

 

00:14:51.365 --> 00:14:52.866

and in classification.

 

00:14:53.026 --> 00:14:54.426

Is this phish, or is this not a phish?

 

00:14:54.487 --> 00:14:55.107

Is it perfect?

 

00:14:55.427 --> 00:14:55.547

No.

 

00:14:55.987 --> 00:14:59.949

But already, most of us, if you're using

 

00:15:01.109 --> 00:15:01.870

email from one of the big

 

00:15:01.950 --> 00:15:03.472

providers or Microsoft or Google,

 

00:15:03.712 --> 00:15:04.853

machine learning is helping

 

00:15:04.913 --> 00:15:06.615

to make sure that the

 

00:15:06.855 --> 00:15:08.457

emails that get to you are

 

00:15:08.497 --> 00:15:09.337

legitimate emails.

 

00:15:09.418 --> 00:15:10.539

Again, not perfect,

 

00:15:10.739 --> 00:15:12.280

but certainly giving you a

 

00:15:12.340 --> 00:15:14.122

leg up and reducing the

 

00:15:14.182 --> 00:15:15.644

noise as much as possible.

 

00:15:15.724 --> 00:15:17.545

So that's been a nice win.

 

00:15:17.645 --> 00:15:19.167

Also looking at the

 

00:15:19.227 --> 00:15:21.449

anomalies and deviations from the norm,

 

00:15:21.469 --> 00:15:21.990

because that's

 

00:15:22.030 --> 00:15:22.871

something machine learning

 

00:15:22.891 --> 00:15:24.692

is really good at: pattern matching.

 

00:15:24.712 --> 00:15:25.293

Yeah.

 

00:15:25.613 --> 00:15:27.474

and finding patterns in very,

 

00:15:27.514 --> 00:15:29.234

very vast amounts of data.

 

00:15:29.614 --> 00:15:31.655

So a human being seeing that

 

00:15:31.695 --> 00:15:32.875

something looks anomalous

 

00:15:32.935 --> 00:15:33.816

or off the baseline,

 

00:15:34.376 --> 00:15:35.476

if you've got thousands or

 

00:15:35.476 --> 00:15:37.157

100,000 users at your company,

 

00:15:37.177 --> 00:15:38.077

that could be really hard

 

00:15:38.117 --> 00:15:39.757

for a human to detect.

 

00:15:39.998 --> 00:15:40.678

But that's something where

 

00:15:40.698 --> 00:15:42.338

machine learning can be very,

 

00:15:42.398 --> 00:15:43.419

very good at showing you

 

00:15:43.459 --> 00:15:44.339

those little bits.

 

00:15:44.839 --> 00:15:46.581

The anomalous flag or alert

 

00:15:46.661 --> 00:15:47.922

may not mean that something

 

00:15:47.982 --> 00:15:49.163

nefarious is happening in

 

00:15:49.203 --> 00:15:50.024

your organization,

 

00:15:50.404 --> 00:15:51.725

but it gives you a good

 

00:15:51.865 --> 00:15:53.146

view into where those

 

00:15:53.247 --> 00:15:54.468

analysts can look to see,

 

00:15:54.848 --> 00:15:56.209

is this something that we expected?

 

00:15:56.289 --> 00:15:57.951

Is this a human being a human,

 

00:15:58.311 --> 00:15:59.532

or is this a human being a

 

00:15:59.592 --> 00:16:00.293

malicious human?

 

00:16:00.773 --> 00:16:03.856

Right, right.
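
A toy sketch of the "deviation from the norm" idea in that answer, assuming a made-up signal: flag a user whose daily login count sits far outside their own historical baseline, as a lead for an analyst rather than proof of anything malicious. The threshold and data are illustrative assumptions; Python standard library only.

from statistics import mean, pstdev

def anomalous(history: list, today: int, threshold: float = 3.0) -> bool:
    # Flag values more than `threshold` standard deviations from the user's baseline.
    baseline = mean(history)
    spread = pstdev(history) or 1.0  # avoid dividing by zero on a flat history
    return abs(today - baseline) / spread > threshold

logins = {
    "alice": ([4, 5, 4, 6, 5, 4, 5], 5),   # an ordinary day
    "bob":   ([3, 2, 3, 3, 2, 3, 2], 41),  # sudden spike worth a second look
}

for user, (history, today) in logins.items():
    status = "review" if anomalous(history, today) else "ok"
    print(f"{user}: {today} logins today -> {status}")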

 

00:16:03.876 --> 00:16:04.957

What are you most excited

 

00:16:05.017 --> 00:16:06.498

about at the intersection

 

00:16:06.538 --> 00:16:07.960

of AI and security?

 

00:16:09.275 --> 00:16:10.536

I am very excited about the

 

00:16:10.596 --> 00:16:12.258

ability to look at the vast

 

00:16:12.298 --> 00:16:13.539

amounts of data and start

 

00:16:13.559 --> 00:16:14.800

to see perhaps patterns

 

00:16:14.840 --> 00:16:15.821

that we hadn't been able to

 

00:16:15.861 --> 00:16:17.363

see before about attacks

 

00:16:17.483 --> 00:16:18.784

and also to help us

 

00:16:18.864 --> 00:16:19.805

understand again that

 

00:16:19.885 --> 00:16:21.587

deviation from the norm.

 

00:16:21.627 --> 00:16:22.568

I think that those are two

 

00:16:22.708 --> 00:16:23.609

really wonderful

 

00:16:23.809 --> 00:16:24.870

applications and we're

 

00:16:24.890 --> 00:16:26.632

seeing some nice advances there.

 

00:16:27.553 --> 00:16:29.574

There's also potentially

 

00:16:29.755 --> 00:16:30.655

getting a little bit more

 

00:16:30.836 --> 00:16:33.358

agency and response on, you know,

 

00:16:33.458 --> 00:16:35.300

if something looks wrong,

 

00:16:36.060 --> 00:16:37.802

what action do we take in response?

 

00:16:37.862 --> 00:16:40.104

I think that might be, you know,

 

00:16:40.224 --> 00:16:41.245

as we get better, we train,

 

00:16:41.265 --> 00:16:42.446

we go in observation mode,

 

00:16:42.506 --> 00:16:43.467

will we be able to get a

 

00:16:43.487 --> 00:16:44.488

little bit more proactive?

 

00:16:44.868 --> 00:16:46.269

That's fairly exciting.

 

00:16:46.389 --> 00:16:47.110

And then the other thing is

 

00:16:47.150 --> 00:16:47.771

around training.

 

00:16:48.411 --> 00:16:49.532

Because some of these tools

 

00:16:49.592 --> 00:16:51.294

can really make it easier.

 

00:16:51.334 --> 00:16:52.795

They emulate what it is to

 

00:16:52.835 --> 00:16:53.856

be talking to a person.

 

00:16:53.896 --> 00:16:54.697

So give us some good

 

00:16:54.717 --> 00:16:55.938

opportunity for training

 

00:16:55.978 --> 00:16:57.360

and improving that training.

 

00:16:58.321 --> 00:17:01.043

So you've been in cybersecurity for,

 

00:17:01.143 --> 00:17:03.125

as you said, across three decades.

 

00:17:03.846 --> 00:17:05.708

What has been one of the

 

00:17:05.768 --> 00:17:06.869

greatest pieces of advice

 

00:17:06.889 --> 00:17:08.090

that maybe a mentor has

 

00:17:08.130 --> 00:17:10.833

given you that has helped

 

00:17:10.873 --> 00:17:11.874

you throughout your career?

 

00:17:14.803 --> 00:17:17.486

Just believing that you can

 

00:17:17.566 --> 00:17:19.207

do it. That actually really came from

 

00:17:19.728 --> 00:17:21.309

one of my very first mentors.

 

00:17:22.090 --> 00:17:23.951

I was working in editorial

 

00:17:24.172 --> 00:17:25.953

and my love of systems,

 

00:17:26.033 --> 00:17:27.054

I had started to do

 

00:17:27.174 --> 00:17:29.016

acquisitions of software

 

00:17:29.056 --> 00:17:30.077

for the textbooks that we

 

00:17:30.117 --> 00:17:30.738

were publishing.

 

00:17:31.478 --> 00:17:32.699

And the person that hired me

 

00:17:32.739 --> 00:17:33.880

into my first IT role,

 

00:17:33.940 --> 00:17:35.362

she saw me and that skill

 

00:17:35.502 --> 00:17:36.963

and thought I could do it.

 

00:17:36.983 --> 00:17:38.264

And I was like, I don't know.

 

00:17:38.284 --> 00:17:40.646

I think I have to have more tech or this.

 

00:17:40.686 --> 00:17:42.288

And she said, no, you've got it.

 

00:17:42.328 --> 00:17:43.328

You've already got that.

 

00:17:43.469 --> 00:17:44.570

And I've heard that from

 

00:17:44.670 --> 00:17:46.091

other people in my career.

 

00:17:46.131 --> 00:17:47.372

There's a sense of that

 

00:17:47.432 --> 00:17:48.673

somehow you're going to get

 

00:17:48.753 --> 00:17:51.235

to a level suddenly and then you're that.

 

00:17:51.455 --> 00:17:51.996

But it's

 

00:17:52.236 --> 00:17:52.416

No,

 

00:17:52.636 --> 00:17:54.298

you go out and you do it and you build

 

00:17:54.338 --> 00:17:55.338

those skills and nobody

 

00:17:55.378 --> 00:17:56.399

comes along and says, oh,

 

00:17:56.880 --> 00:17:58.321

now you're a threat hunter

 

00:17:58.341 --> 00:17:59.942

or now you're a SOC analyst.

 

00:17:59.962 --> 00:18:01.043

You might get that title.

 

00:18:01.503 --> 00:18:02.904

But, you know,

 

00:18:03.044 --> 00:18:03.965

it's really it's the work

 

00:18:04.005 --> 00:18:06.527

that you've done to get into that role.

 

00:18:06.607 --> 00:18:07.928

So I think just really

 

00:18:08.308 --> 00:18:10.150

understanding and having

 

00:18:10.230 --> 00:18:11.691

faith in yourself, do the work,

 

00:18:12.151 --> 00:18:13.412

but then also give yourself

 

00:18:13.492 --> 00:18:15.133

credit that you have done the work, which,

 

00:18:15.734 --> 00:18:15.934

you know,

 

00:18:15.954 --> 00:18:16.815

that's always been a little bit

 

00:18:16.835 --> 00:18:17.255

hard for me.

 

00:18:17.893 --> 00:18:18.073

Yeah,

 

00:18:18.113 --> 00:18:19.854

I think that can be hard for a lot of

 

00:18:19.894 --> 00:18:21.335

people too. There's, you know,

 

00:18:21.355 --> 00:18:22.576

we hear about imposter

 

00:18:22.596 --> 00:18:24.338

syndrome and not feeling like, you know,

 

00:18:24.378 --> 00:18:25.979

you're supposed to be here,

 

00:18:26.139 --> 00:18:26.659

but we're here.

 

00:18:26.679 --> 00:18:27.980

We're doing the work.

 

00:18:28.060 --> 00:18:29.261

So yeah.

 

00:18:29.581 --> 00:18:30.782

What advice would you have

 

00:18:30.842 --> 00:18:32.083

to somebody who is looking

 

00:18:32.123 --> 00:18:33.704

at getting into either

 

00:18:33.764 --> 00:18:35.666

cybersecurity or specifically AI?

 

00:18:37.541 --> 00:18:38.341

In both cases,

 

00:18:38.521 --> 00:18:40.122

what is it that you really love to do?

 

00:18:40.342 --> 00:18:41.403

Why do you want to do it?

 

00:18:41.603 --> 00:18:42.503

And I've talked to people

 

00:18:42.563 --> 00:18:44.264

going into security and now

 

00:18:44.284 --> 00:18:45.724

security of AI because it's

 

00:18:45.784 --> 00:18:47.405

the one that's getting a

 

00:18:47.445 --> 00:18:48.486

lot of attention right now.

 

00:18:49.206 --> 00:18:51.407

And I've heard, well,

 

00:18:51.427 --> 00:18:52.887

because I don't want to be unemployed.

 

00:18:53.788 --> 00:18:55.028

I want to make a lot of money.

 

00:18:55.689 --> 00:18:57.029

I want to get the bad guys.

 

00:18:57.129 --> 00:18:58.350

Okay, those are all good things.

 

00:18:58.370 --> 00:18:59.910

Think about why it is that

 

00:18:59.930 --> 00:19:01.411

you're going into this industry,

 

00:19:01.627 --> 00:19:01.987

field,

 

00:19:02.567 --> 00:19:04.488

and then which part of the field

 

00:19:04.548 --> 00:19:05.708

that really interests you.

 

00:19:06.008 --> 00:19:07.609

There are highly technical

 

00:19:07.649 --> 00:19:09.970

roles in security and security for AI,

 

00:19:10.310 --> 00:19:11.570

and there are less technical roles.

 

00:19:11.630 --> 00:19:13.811

We need lawyers, we need graphic designers,

 

00:19:13.871 --> 00:19:15.211

we need sociologists to

 

00:19:15.251 --> 00:19:15.971

help us understand the

 

00:19:15.991 --> 00:19:17.052

psychology of the people

 

00:19:17.072 --> 00:19:18.412

that are responding to these systems.

 

00:19:18.812 --> 00:19:19.812

We need a whole lot of

 

00:19:19.952 --> 00:19:21.193

insight and intelligence

 

00:19:21.353 --> 00:19:22.793

around these problems of

 

00:19:22.853 --> 00:19:24.874

both cyber and cyber and AI.

 

00:19:26.167 --> 00:19:27.309

So what is it you love to do?

 

00:19:27.670 --> 00:19:28.772

Because if it's your passion

 

00:19:28.812 --> 00:19:29.714

and you love to do it,

 

00:19:29.754 --> 00:19:31.818

you're much more likely to succeed in it,

 

00:19:32.419 --> 00:19:33.080

especially because you're

 

00:19:33.100 --> 00:19:34.142

not going to show up every day,

 

00:19:34.523 --> 00:19:35.003

I hate this.

 

00:19:35.023 --> 00:19:36.146

You're going to show up and be like,

 

00:19:36.266 --> 00:19:38.009

I really I want to do this work.

 

00:19:38.892 --> 00:19:41.454

So that's really it: understand

 

00:19:41.534 --> 00:19:42.534

what it is you want to do,

 

00:19:42.594 --> 00:19:43.775

start to get a feel for

 

00:19:43.895 --> 00:19:45.896

which area is most interesting to you.

 

00:19:46.176 --> 00:19:47.237

And then once you've got that,

 

00:19:47.477 --> 00:19:48.818

you can get better advice

 

00:19:48.878 --> 00:19:50.539

and guidance on exactly what to do.

 

00:19:50.559 --> 00:19:51.379

If somebody says that they

 

00:19:51.419 --> 00:19:52.600

want a really technical path,

 

00:19:52.800 --> 00:19:53.541

they don't like people,

 

00:19:53.821 --> 00:19:54.781

they want to sit with their

 

00:19:54.821 --> 00:19:55.742

computer all the time,

 

00:19:56.022 --> 00:19:58.163

and they just want to look at code, okay,

 

00:19:58.443 --> 00:19:59.344

then they might be really

 

00:19:59.384 --> 00:20:00.605

good as a threat hunter,

 

00:20:00.905 --> 00:20:02.886

they might be good as a malware engineer,

 

00:20:02.966 --> 00:20:04.027

or be able to

 

00:20:07.205 --> 00:20:08.386

to look for vulnerabilities

 

00:20:08.466 --> 00:20:10.208

in other systems, be a red teamer.

 

00:20:10.468 --> 00:20:11.429

There are a lot of things

 

00:20:11.449 --> 00:20:13.410

that might be better for that person.

 

00:20:13.450 --> 00:20:15.132

And then if that's where they wanted to go,

 

00:20:15.152 --> 00:20:15.972

I would say things like

 

00:20:16.273 --> 00:20:16.973

start going to the more

 

00:20:17.013 --> 00:20:19.595

technical conferences, go to,

 

00:20:19.916 --> 00:20:22.198

go to CFPs and compete in CFPs,

 

00:20:22.258 --> 00:20:26.982

capture the, not CFPs, CFTs, CTFs, sorry.

 

00:20:27.182 --> 00:20:27.842

Exactly.

 

00:20:29.224 --> 00:20:30.846

capture that, call for papers,

 

00:20:31.928 --> 00:20:34.612

capture the flag and participate in that.

 

00:20:35.012 --> 00:20:35.774

And then that's going to

 

00:20:35.794 --> 00:20:36.575

start building out a

 

00:20:36.595 --> 00:20:37.656

network and people are going to, oh,

 

00:20:37.716 --> 00:20:39.299

this person has mad skills

 

00:20:39.880 --> 00:20:41.262

doing these kinds of work.

 

00:20:41.722 --> 00:20:43.164

Then that network could help

 

00:20:43.225 --> 00:20:44.166

you find a job.

 

00:20:44.346 --> 00:20:44.647

If you,

 

00:20:45.107 --> 00:20:45.968

As you're sitting and

 

00:20:46.009 --> 00:20:46.790

interrogating yourself,

 

00:20:46.810 --> 00:20:48.432

which part of this do I like the most,

 

00:20:48.492 --> 00:20:48.772

it's like,

 

00:20:48.912 --> 00:20:50.695

I really like the policy and I

 

00:20:50.755 --> 00:20:51.636

want to write the policy,

 

00:20:51.656 --> 00:20:52.797

but I want it to be the

 

00:20:52.837 --> 00:20:54.900

policy that gets adopted up in Washington,

 

00:20:54.940 --> 00:20:55.401

D.C.

 

00:20:55.681 --> 00:20:56.522

that becomes law.

 

00:20:56.903 --> 00:20:57.944

OK, in that case,

 

00:20:57.964 --> 00:20:58.825

you're probably going to be

 

00:20:58.865 --> 00:21:00.447

veering more towards at

 

00:21:00.548 --> 00:21:01.609

least the government and

 

00:21:01.649 --> 00:21:03.191

potentially even legal policy.

 

00:21:03.539 --> 00:21:04.519

so that you can go up on the

 

00:21:04.539 --> 00:21:05.900

Hill and understand how to

 

00:21:05.960 --> 00:21:07.580

create laws and what those laws are like.

 

00:21:08.380 --> 00:21:09.741

And that would be a different path.

 

00:21:10.121 --> 00:21:11.481

So it's most important first

 

00:21:11.521 --> 00:21:12.621

to really understand which

 

00:21:12.801 --> 00:21:16.862

area of this practice you want to go into,

 

00:21:17.063 --> 00:21:18.163

and then you can start to

 

00:21:18.363 --> 00:21:20.924

tune your own path forward.

 

00:21:21.244 --> 00:21:21.504

However,

 

00:21:21.524 --> 00:21:23.524

there's one thing that in all of it,

 

00:21:23.684 --> 00:21:24.945

I think really does matter,

 

00:21:25.145 --> 00:21:26.305

which is networking.

 

00:21:27.745 --> 00:21:29.445

networking, finding people,

 

00:21:29.946 --> 00:21:31.006

even if you're mostly,

 

00:21:31.146 --> 00:21:32.086

I'm a huge introvert,

 

00:21:32.166 --> 00:21:33.746

even if you're a really big introvert,

 

00:21:33.886 --> 00:21:35.327

finding your people that

 

00:21:35.367 --> 00:21:36.847

you can talk to and finding

 

00:21:36.867 --> 00:21:38.187

the organizations and the

 

00:21:38.227 --> 00:21:39.568

groups that you feel comfortable with,

 

00:21:39.848 --> 00:21:40.588

they're going to A,

 

00:21:40.688 --> 00:21:41.768

help you really advance

 

00:21:41.868 --> 00:21:44.249

because they know things you don't know.

 

00:21:44.309 --> 00:21:45.509

So that's like, how do you learn?

 

00:21:45.529 --> 00:21:46.569

Sometimes it's by talking to

 

00:21:46.609 --> 00:21:47.229

other smart people.

 

00:21:47.510 --> 00:21:48.250

The other thing that will

 

00:21:48.290 --> 00:21:49.610

really help you advance is

 

00:21:49.690 --> 00:21:51.670

that a lot of times in these groups,

 

00:21:52.011 --> 00:21:53.851

they know someone who's hiring somewhere.

 

00:21:54.571 --> 00:21:55.572

And that can help you get

 

00:21:55.592 --> 00:21:57.154

that job or the next job,

 

00:21:57.294 --> 00:21:58.855

or at least understand what

 

00:21:58.895 --> 00:22:00.096

skills you need to get

 

00:22:00.136 --> 00:22:00.937

that job that you want.

 

00:22:00.977 --> 00:22:01.918

So then you can create and

 

00:22:01.958 --> 00:22:03.720

craft your own plan for advancement.

 

00:22:04.781 --> 00:22:07.363

Yeah, I completely agree with that.

 

00:22:07.523 --> 00:22:08.624

Like that has been, you know,

 

00:22:08.704 --> 00:22:10.926

definitely helpful for me in my path.

 

00:22:10.966 --> 00:22:11.727

And then also just

 

00:22:11.987 --> 00:22:13.189

understanding what

 

00:22:13.249 --> 00:22:14.370

fulfillment means to you.

 

00:22:15.351 --> 00:22:16.712

in selecting roles and

 

00:22:16.732 --> 00:22:18.033

different projects to work on.

 

00:22:18.073 --> 00:22:18.814

I think, you know,

 

00:22:18.854 --> 00:22:20.495

it helps because if you

 

00:22:20.595 --> 00:22:21.957

don't know what fulfills you,

 

00:22:21.977 --> 00:22:23.438

you don't know what your values are,

 

00:22:23.618 --> 00:22:24.719

you could get into a role

 

00:22:24.759 --> 00:22:25.800

and you can make tons of money,

 

00:22:25.840 --> 00:22:26.921

but it'll feel like a drag

 

00:22:27.041 --> 00:22:27.742

every single day.

 

00:22:28.322 --> 00:22:30.264

So, I mean, you gotta, you know, find it,

 

00:22:30.484 --> 00:22:32.987

what it is that you love and drives you.

 

00:22:33.127 --> 00:22:34.168

So, yeah.

 

00:22:34.958 --> 00:22:37.519

Diana, thank you for your time today.

 

00:22:37.539 --> 00:22:39.559

I just wanted to say I

 

00:22:39.599 --> 00:22:40.920

really appreciate your time

 

00:22:41.100 --> 00:22:42.220

and thank you for coming on

 

00:22:42.340 --> 00:22:43.800

as my first guest for the

 

00:22:44.281 --> 00:22:46.081

On Cyber & AI podcast.

 

00:22:46.181 --> 00:22:50.442

We love you here at ITSP.

 

00:22:50.582 --> 00:22:51.022

So yes,

 

00:22:51.142 --> 00:22:53.443

thank you so much and thank you to

 

00:22:53.603 --> 00:22:54.503

everybody tuning in.

 

00:22:55.624 --> 00:22:56.064

Thanks.

 

00:22:56.804 --> 00:22:57.124

Bye.