Helen Oakley and Dmitry Raidman reveal how Software and AI Bills of Materials (SBOMs and AISBOMs) are transforming supply chain transparency from a compliance checkbox into a business advantage. In this episode, they unveil a new open-source tool and explain why understanding what’s inside your AI and software stack is critical for trust, security, and operational clarity.
Helen Oakley, Senior Director of Product Security at SAP, and Dmitry Raidman, Co-founder and CTO of Cybeats, joined us live at the RSAC Conference to bring clarity to one of the most urgent topics in cybersecurity: transparency in the software and AI supply chain. Their message is direct—organizations not only need to understand what’s in their software, they need to understand the origin, integrity, and impact of those components, especially as artificial intelligence becomes more deeply integrated into business operations.
SBOMs Are Not Optional Anymore
Software Bills of Materials (SBOMs) have long been a recommended best practice, but they’re now reaching a point of necessity. As Dmitry noted, organizations are increasingly requiring SBOMs before making purchase decisions—“If you’re not going to give me an SBOM, I’m not going to buy your product.” With regulatory pressure mounting through frameworks like the EU Cyber Resilience Act (CRA), the demand for transparency is being driven not just by compliance, but by real operational value. Companies adopting SBOMs are seeing tangible returns—saving hundreds of hours on risk analysis and response, while also improving internal visibility.
Bringing AI into the SBOM Fold
But what happens when the software includes AI models, data pipelines, and autonomous agents? Helen and Dmitry are leading a community-driven initiative to create AI-specific SBOMs—referred to as AI SBOMs or AISBOMs—to capture critical metadata beyond just the code. This includes model architectures, training data, energy consumption, and more. These elements are vital for risk management, especially when organizations may be unknowingly deploying models with embedded vulnerabilities or opaque dependencies.
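To make the idea concrete, here is a minimal sketch, in Python, of how that kind of AI-specific metadata might be expressed alongside a CycloneDX-style component entry. It is an illustration only, not an excerpt from the CycloneDX specification or from the tool discussed in this episode; the model name, dataset name, and several property names are assumptions chosen for readability.

```python
# Illustrative sketch of an AISBOM-style entry for a single model.
# Field names loosely follow CycloneDX's machine-learning-model component
# and model card support, but are simplified assumptions, not normative.
import json

aisbom_sketch = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.6",
    "components": [
        {
            "type": "machine-learning-model",
            "name": "example-org/example-model",   # hypothetical model identifier
            "version": "1.0",
            "modelCard": {
                "modelParameters": {
                    "task": "text-classification",
                    "architectureFamily": "transformer",
                    "datasets": [
                        # Training data provenance is part of the AI-specific metadata
                        {"type": "dataset", "name": "example-training-set"}  # hypothetical
                    ],
                },
                "considerations": {
                    # Placeholder for the energy/environmental metadata mentioned in the episode
                    "environmentalConsiderations": {
                        "energyConsumptions": [{"activity": "training"}]
                    }
                },
            },
        }
    ],
}

print(json.dumps(aisbom_sketch, indent=2))
```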
A Tool for the Community, Built by the Community
In an important milestone for the industry, Helen and Dmitry also introduced the first open source tool capable of generating CycloneDX-formatted AISBOMs for models hosted on Hugging Face. This practical step bridges the gap between standards and implementation—helping organizations move from theoretical compliance to actionable insight. The community’s response has been overwhelmingly positive, signaling a clear demand for tools that turn complexity into clarity.
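For a rough feel of what such a generator involves, the sketch below pulls public model metadata from the Hugging Face Hub and folds it into a CycloneDX-style document. It is not the tool Helen and Dmitry released; the property names and the example model id are assumptions made for illustration.

```python
# Minimal sketch: derive a CycloneDX-style AISBOM stub from Hugging Face metadata.
# Requires: pip install huggingface_hub
import json
from huggingface_hub import HfApi

def sketch_aisbom(repo_id: str) -> dict:
    """Build a tiny, illustrative BOM-like dict for one Hugging Face model."""
    info = HfApi().model_info(repo_id)  # public repo metadata (no token needed)
    component = {
        "type": "machine-learning-model",
        "name": repo_id,
        "version": info.sha or "unknown",  # commit hash as a stand-in for a version
        "properties": [
            {"name": "huggingface:pipeline_tag", "value": str(info.pipeline_tag)},
            {"name": "huggingface:tags", "value": ",".join(info.tags or [])},
        ],
    }
    return {"bomFormat": "CycloneDX", "specVersion": "1.6", "components": [component]}

if __name__ == "__main__":
    # Any public model id works; this one is just an example.
    print(json.dumps(sketch_aisbom("bert-base-uncased"), indent=2))
```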
Why Security Leaders Should Pay Attention
The real value of an SBOM—whether for software or AI—is not just external compliance. It’s about knowing what you have, recognizing your crown jewels, and understanding where your risks lie. As AI compounds existing vulnerabilities and introduces new ones, starting with transparency is no longer a suggestion—it’s a strategic necessity.
Want to see how this all fits together? Hear it directly from Helen and Dmitry in this episode.
___________
Guests:
Helen Oakley, Senior Director of Product Security at SAP | https://www.linkedin.com/in/helen-oakley/
Dmitry Raidman, Co-founder and CTO of Cybeats | https://www.linkedin.com/in/draidman/
Hosts:
Sean Martin, Co-Founder at ITSPmagazine | Website: https://www.seanmartin.com
Marco Ciappelli, Co-Founder at ITSPmagazine | Website: https://www.marcociappelli.com
___________
Episode Sponsors
ThreatLocker: https://itspm.ag/threatlocker-r974
Akamai: https://itspm.ag/akamailbwc
BlackCloak: https://itspm.ag/itspbcweb
SandboxAQ: https://itspm.ag/sandboxaq-j2en
Archer: https://itspm.ag/rsaarchweb
Dropzone AI: https://itspm.ag/dropzoneai-641
ISACA: https://itspm.ag/isaca-96808
ObjectFirst: https://itspm.ag/object-first-2gjl
Edera: https://itspm.ag/edera-434868
___________
Resources
LinkedIn Post with Links: https://www.linkedin.com/posts/helen-oakley_ai-sbom-aisbom-activity-7323123172852015106-TJea
An open letter to third-party suppliers: https://www.jpmorgan.com/technology/technology-blog/open-letter-to-our-suppliers
Learn more and catch more stories from RSA Conference 2025 coverage: https://www.itspmagazine.com/rsa-conference-usa-2025-rsac-san-francisco-usa-cybersecurity-event-infosec-conference-coverage
______________________
KEYWORDS
helen oakley, dmitry raidman, sean martin, rsac 2025, sbom, aisbom, ai security, software supply chain, transparency, open source, event coverage, on location, conference
______________________
Catch all of our event coverage: https://www.itspmagazine.com/technology-and-cybersecurity-conference-coverage
Want to tell your Brand Story Briefing as part of our event coverage? Learn More 👉 https://itspm.ag/evtcovbrf
Want Sean and Marco to be part of your event or conference? Let Us Know 👉 https://www.itspmagazine.com/contact-us
Building Trust Through AI and Software Transparency: The Real Value of SBOMs and AISBOMs | An RSAC Conference 2025 Conversation with Helen Oakley and Dmitry Raidman | On Location Coverage with Sean Martin and Marco Ciappelli
Please note that this transcript was created using AI technology and may contain inaccuracies or deviations from the original audio file. The transcript is provided for informational purposes only and should not be relied upon as a substitute for the original recording, as errors may exist. At this time, we provide it “as it is,” and we hope it can be helpful for our audience.
_________________________________________
Sean Martin: [00:00:00] and here we are. We're in San Francisco. We're not in Toronto. You, you made it to San Francisco, RSAC conference. How's it been so far?
Helen Oakley: It's been amazing. Yeah. Very busy time as always. Uh, I know San Francisco,
Sean Martin: 40, 40 plus thousand of our closest friends.
Helen Oakley: Oh, yes.
Sean Martin: And Dmitry, it's a pleasure to meet you for the first time, uh, to connect in person.
Pleasure is mine. Yes. Thanks. Thanks for being here. And Helen Oakley, good to have you on again. Yeah, it's always a pleasure.
Helen Oakley: It's been a little bit, it's been a
Sean Martin: little, what was it? Uh, October? Yeah, the SEC I think was the last time. We had a chat, but, um, you've been busy de of you Dmitri working on some new things.
We all busy. So we're gonna talk about some of that and, uh, obviously some of the things that, that you've been discussing broader speaking at, at, uh, the conference here. Um, for those who don't know you, maybe a few words about what you're up to. Maybe some of the organizations [00:01:00] you're, you're part of, some of the, the speaking circuit that you, uh, that you.
Your journeys around the world, sharing, sharing what you're working on. Helen, we'll start with you folks. Yes. Folks may have seen you on the show before, but just give 'em a little refresher.
Helen Oakley: Yeah. I'm a senior director at SAP, uh, on the product security team, and we engage in different, um, initiatives and we establish security across SAP software.
But outside of SAP, for fun, I do
Sean Martin: for fun. Yes. Yes. I do a
Helen Oakley: lot of contributions, uh, to different, um, organizations like OWASP, agentic threats and mitigations. Um, I co-lead with Dmitry, um, as well, um, the AI SBOM, the AISBOM, uh, initiative, uh, under the CISA SBOM working groups, the Tiger Teams. And, um, of course, um, we also have some other initiatives going on that we are working on, like, um, the AI Integrity and Safe Use Foundation, and also building some useful tools.
Sean Martin: Yes, we'll touch on one of the tools, [00:02:00] at least, um, Dmitry. What, what, what are you up to?
Dmitry Raidman: Eh, so most of my day is spent, uh, being a co-founder and CTO at a company called Cybeats. Cybeats is a company that, uh, builds a platform to manage SBOMs and provides software supply chain security.
Sean Martin: So the broader bum, the brother Asbo.
The
Dmitry Raidman: big, yeah, the big, the big word. Yes. Just this morning, uh, we hear from. The government that ASBOs becoming more and more important. And I'm working with a lot of customers in the Fortune 500 area, and we're talking about these bombs or eggs, bombs, right? Uh, we, we, I, I hear things like, Hey, if you're not gonna gimme an asbo, I'm not going to buy a product X from you.
Sean Martin: Oh, we are hearing that already? Yeah. Okay. I wonder if we, I was wondering if we'd get to that point where, uh. Yeah, there'd be requirements
Helen Oakley: at some point. Yes. [00:03:00] Because the EU CRA is coming up. Right. And uh, that will be a requirement.
Sean Martin: Yeah. We didn't touch on that yet together. I think I had somebody else on for the CRA, but, uh, sounds like you might be on the show again 'cause we're not gonna touch on that today.
We only have 15, 20 minutes, but, so let's, um, who, who wants to, maybe Dmitry, you start the big, the big state of the union, if you will, for SBOM. Maybe an overview from that and then maybe a deeper dive into the AI space there. And I don't know if we, if agentic AI is a part of that, and I know we'll have to talk, so maybe, to me, kinda the big picture first and then we'll get into the, yeah,
AI. The big, spell out SBOM for folks. Yeah, of course. We're not just using acronyms here for the first round,
Dmitry Raidman: so definitely SBOM stands for software bill of materials, and I would say that that's one of the hottest areas now in the space of supply chain security, going into a lot of, uh, [00:04:00] regulations and required by many organizations to achieve transparency.
Basically, people want to know what they buy and what's inside the black box, and it's quite a simple concept, because we got used to it from buying, uh, products at the grocery shop. If you look at any product, at its package, right, we can see, turn it around and see, there's the ingredients. There's five ingredients, right?
And they're all listed there. Why are we not doing it for software, or devices that are running software in them, when we procure these devices? So that's
Sean Martin: how important is it that we not just know what's in it, but also where it's from? 'Cause I can think of, obviously we have products at the store we can buy.
When you go to a restaurant, sometimes they actually present as a, as a value that you know where the cheese is sourced from and the meat is sourced from, and the wheat or whatever the vegetables are sourced from. Mm-hmm. How, how important is the where in what we're talking about here? [00:05:00]
Dmitry Raidman: Well, this, this year it became more important because of the, you know, but let's not get to that point.
Maybe a little triggers we won't go
Sean Martin: down there. Yeah. But,
Dmitry Raidman: but it's, it's quite important, uh, because people are looking at, uh, not just, uh, where the product is coming from, but also, when you're building the product and you're making the design and you're thinking of what goes into your product ingredients and components, they want you to think, okay, who's contributing to this?
Uh, open sources. Because if you're looking at the statistics, between 70 to 90% of any software-based product is open source, and this open source could come from different places. I mean, we recently have seen the attempt, the almost successful attempt of, um, injecting malicious, uh, I would say, code into the xz library that had been used by every single Linux operating system across the world.
And luckily it was averted, but [00:06:00]
Sean Martin: yeah, I didn't wanna go there, but I'm gonna do it anyway. We're not talking politics here, but sometimes there are things that drive action more than morals and, and ethics, and money is often one of those drivers. So, is there an opportunity perhaps to actually get a solid SBOM infrastructure in place?
Purely because companies need to know how, where stuff's coming from and if it's going to cost them more or less. Can we, is is there an option to use that lever, do you think?
Dmitry Raidman: I, I think that this lever exists and, uh, what we are seeing among our customers is that when they're adopting SBOMs, it's not just because they have to do that.
They're finding out that there is a lot of ROI that comes beyond the security, beyond, beyond the, Hey, I'm compliant, I have an SBOM. [00:07:00] Right. Okay. I, I mean, one example from one of our customers is they have saved hundreds of hours for every open source project that they're running, just by adopting SBOMs and then using a platform that can handle the SBOMs, monitor them continuously, and also provide them alerts about new risks, vulnerabilities, et cetera.
Sean Martin: So how, how are things similar or different when we start talking about AI, and maybe, maybe also the agentic AI? Because I mean, there's a lot of open source, there's a lot of custom-built stuff. I mean, paint that picture for us, huh?
Helen Oakley: Yeah, so we are looking at, um, also dependencies, but it's a different type of dependencies. In traditional software dependencies, we have open source. In AI, we also have open source type of frameworks, and we have underlying architectures and we have data sets. So it's becoming a whole different [00:08:00] paradigm in addition to traditional software. And that's why we are calling it AISBOM, also known as AI BOM, artificial intelligence software bill of materials, to make sure that we collect, in addition to traditional software, those AI-relevant metadata that we collect about models, about data sets, and all the ecosystem there.
Sean Martin: Yeah. 'Cause it, it'd be easy to forget that the model, I guess different versions of the model, right? Mm-hmm. And also, uh, is part of that the, the data it was trained on, perhaps, not just what it's actually being used on in production? I, is that part of the picture?
Helen Oakley: Information about training is part of, uh, the artificial intelligence bill of materials, definitely.
Okay. Um, and even information about energy consumption? Yes. So we have a lot of opportunities to collect different types of data. And you mentioned agentic, of course, because agents, so they use, um, LLMs and models in the background. So it'll [00:09:00] have also AI BOMs, um, or AISBOMs, as part of the whole transparency.
Sean Martin: Okay. And forgive me if I didn't hear it, but dropping the S from the AISBOM is, what, be, because of the hardware piece that's running this, or what's the purpose of that?
Helen Oakley: No, um, it's a good question, because, um, traditionally, if you look at the standards, mm-hmm, SPDX and CycloneDX, traditionally we call them AI BOM or ML BOM, similar to HBOM, right?
So CBOM, like use cases of an SBOM. But the thing is, it becomes very confusing to a certain extent, because people start thinking that this is a different thing from the SBOM, but it's not. And this is why we did a little rebranding at the beginning of this year, uh, to make it obvious. So we are working on this use case document for AISBOM.
Got it. And we want to make sure that industry and [00:10:00] different types of roles across industry understand, uh, the language and the language is consistent. Right.
Sean Martin: Got it. All right. So you have a language, it's consistent. You need to have the conversation. Mm-hmm. And, and the tool to actually enact and do something with the, the understanding that you have and the conversation you have.
So you presented here at RSAC Conference. Mm-hmm. Um, and you also presented, as part of your presentation, I believe you, you, uh, announced a new tool, that open source tool that you guys put together, right? Yes. So tell, tell us a little bit about the session and, and how the tool is part of that.
Helen Oakley: Yeah. So yesterday we ran our, uh, third, uh, AISBOM workshop.
Uh, second at the RSA, uh, third, so a year ago we ran the first one, and this is where we initiated the effort to write use cases of, of AISBOM. We're almost done with the document that will be released out of [00:11:00] CISA, uh, documentation. So it's just going through final reviews and, you know, just maybe a little bit of time for publication, but it's coming out there for general use. And as part of the, uh, exciting news,
so this is really the first on the market open source tool that can generate an AISBOM in CycloneDX format for models on Hu, Hugging Face. So if you step back a little bit, um, why is it so important? Because there's a lot of complexity in generating this kind of format. It requires different tools and different environments for collecting this metadata.
And generally it should be developed together with the model and provided by the providers of the model. Okay. But because we have, um, you know, like Allan Friedman, uh, yesterday said, we are building the plane as we're flying it,
Sean Martin: right?
Helen Oakley: Yeah. And it's exactly what is happening right now. We are building all the AI systems and we're trying, uh, to also engage, uh, in [00:12:00] different initiatives to make sure that we build security from the start.
Um, and this is where we also need to provide the tool for industry to help generate. And a lot of, uh, organizations are using models from Hugging Face. So if we have the tool already generating this format for them, then they will be able to really improve their transparency, already use it, 'cause that's the majority chunk of usage.
Sean Martin: Yeah. Visibility and transparency builds trust, hopefully.
Helen Oakley: Yeah. Do you have something to add, perhaps?
Dmitry Raidman: Well, I can say that since we kicked off this, uh, work stream as part of the CISA Tiger Teams, mm-hmm, and worked on the AISBOMs, I'm continuously bombarded by people on LinkedIn, or people who meet me, saying, Hey, we know how to generate SBOMs for our software.
We do it already. Mm-hmm. And, and really, like, whoever it was important for to generate this SBOM, they already have done it. They either bought some solution [00:13:00] or took some open source. But how do we create AISBOMs? Like, what's the way, how do we structure them? What kind of fields? And these fields are already defined, right?
I mean, so the standards, the two standards we are following today, which are SPDX and CycloneDX, they already have the support, but there was nothing out there to create these SBOMs for these AI components. And we decided to pick up on that effort. Mm-hmm. And that's the
Sean Martin: result. Yeah. Yeah. What was some of the feedback from the session?
Helen Oakley: Um, honestly, I'm overwhelmed. Like, even yesterday I saw a ton of posts already on, on social, and today, uh, people just meeting on the expo floor and just congratulating and saying, like, it's really useful, they've been waiting for something like that. So I'm really happy to hear that, because, um, our goal is to really contribute back to the community and provide this tool to help organizations with transparency.
Sean Martin: Yeah, that's why I appreciate you so much. You do, [00:14:00] you do a lot. I'm just meeting you, but it sounds like you're very involved as well. Mm-hmm. And giving back to, uh, the community. Super cool. So that's why I love having you both on. But, um, let's, you're also speaking on, is it Wednesday? Tomorrow? Wednesday, right?
Yes. Yes.
Helen Oakley: The days are flying by. So,
Sean Martin: so tell us a little bit about that, that session. This will be released after you talk. Mm-hmm. So we're not giving away any secrets, but, uh, what, what are we talking about there?
Helen Oakley: There is no secret. We already released a, a paper, uh, a guideline, uh, out of the OWASP GenAI project, um, uh, which is, um, already known for the Top 10 for LLMs.
Yes. Um, yeah. So out of that we released a very specific guide for threats and mitigations of agentic AI. So, uh, several distinguished experts have written that, with community effort, collaboration, and reviews and contributions. Uh, in February, the first version was released. We are already working on updating, um, and, uh, [00:15:00] planning for the next update with, um, more information.
And there are different work streams that are working on pieces of vulnerable code examples as well for agents, how to implement agents in a secure manner. So supply chain, of course, is one of the topics that we are also going to collaborate, uh, incorporate, uh, tomorrow at the session. Um, there is, um, um, the AI Summit, the OWASP AI Summit.
Right. And, um, people will join us.
Sean Martin: I don't know if it was your intention to go, if it didn't work out timing-wise, but yeah. So could be a good
Helen Oakley: Yeah. And we will run some sessions already, uh, about, um, you know, what we are planning to do. We are actually gonna do, um, um, a small mini threat modeling of MCP, Model Context Protocol.
I'm, I'm sure you heard about that one. Um, uh, based on our Agentic AI Threats and Mitigations. Yes. So that's gonna be fun.
Sean Martin: A lot, a lot of activity. Let's, um, we have a couple minutes here. Okay. I want to, [00:16:00] so a lot of, obviously people watching this are connected to RSAC conference. They're trying to wanna see what's going on here.
We're hope, hopefully bringing some news with this. Um, but my audience specifically are security leaders, CISOs, CSOs, some of my CIOs and CTOs as well, who clearly have a hand in building products, building infrastructure, getting the business to run, hopefully doing it safely. Any, any words of advice to them, given the tool you've released, the work you've done for SBOMs and AISBOMs?
Um, maybe something to share with them to kind of help them wrap their head around what's going on here.
Dmitry Raidman: Let's start with you. Of course. Uh, so one of my pieces of advice would be, um, when you are up to adopting something like SBOMs, AI BOMs, CBOMs, which is cryptography, HBOMs, which is hardware, right, [00:17:00] and whatever BOMs will come.
And together with that, sushi
Sean Martin: BOMs, I heard earlier. Yeah. Sushi BOMs. Yes. I'm always about the food. So absolutely.
Dmitry Raidman: I think one of the things that, uh, I always advise to look at is the, the internal benefits and internal value that these companies will get from this transparency, that is not only intended to be reflected out to their customers, but first they're becoming transparent to themselves.
They're now aware of the things, they can recognize, suddenly, out of all of this noise, where are their crown jewels. And then they can recognize what are the low-hanging fruits that they need to address. And that's amazing. Once we learn these things, we can add to them.
Sean Martin: Yeah, it could be easy to get overwhelmed and not start right.
Mm-hmm. Yeah, Helen,
Helen Oakley: And, you know, AI, um, AI agents, agentic AI and [00:18:00] general AI are amplifying risks that we already have, right? Right. Um, somehow we are forgetting the fundamental, um, implementation controls, like, you know, authorizations or so, and then on top of that we have additional autonomy of, of the system.
So things like transparency, AISBOM, will definitely provide, um, visibility into those risks, uh, from the underlying dependencies. Yeah. There are a lot of examples where remote code execution is embedded as a feature. Yeah. Right. And, um, things like that we, we need to be aware of as leaders, right?
And of course, um, have a risk assessment of all the AI components that you do and that you adopt in your organization. Um, it's very important to have that.
Sean Martin: Yeah.
Helen Oakley: And by the way, um, since I have you here,
Sean Martin: ah,
Helen Oakley: I have a gift for you. Is it a new one? Yeah.
Sean Martin: Ooh, look at that. It's an AISBOM
Helen Oakley: sticker.
Sean Martin: Oh. The AISBOMs.
And
Helen Oakley: you can find, and
Sean Martin: I have the other one I stole from sector. I carry that around. So thank [00:19:00] you for that. Love it. So if you don't start, you won't know. And if you don't know, you can't take action. Yeah. Simple as that. Some of the best practices start, start small. Helen Dmitri, thank you both. Keep doing the good work for the community.
Thanks for sharing the tool with everybody. Thanks for sharing this time with me. Thanks everybody for joining us here at RSAC conference. Lots more coming to you over the next few days. Its me magazine.com/rsac 25. Stay tuned, subscribe, share with your friends and enemies. We'll see you soon.