A Zero Trust Leadership Podcast
The Math Doesn’t Math: Why a Former White House CIO Says Cybersecurity Is Broken | Theresa Payton
Recorded live at RSA Conference 2026, Theresa Payton, the first female White House CIO, joins to tackle a critical question: if organizations are spending more than ever on cybersecurity, why are outcomes getting worse?
Transcript
Raghu N 00:00
Welcome to another episode of The Segment, this time, in a first for The Segment, recorded at RSAC 2026, so it's a big step for this little podcast of ours. Today, I am so excited to be joined by Theresa Payton, who made history as the first female White House Chief Information Officer, where she was responsible for protecting some of the most sensitive systems in the world. Since then, she's advised Fortune 500 boards, CEOs, and technology leaders on how to navigate risk in an increasingly complex digital landscape. She's also the CEO of Fortalice Solutions and a leading voice on AI, privacy, and security, especially when it comes to how emerging technologies are reshaping the threat landscape. In this conversation, we're going to explore a big question: if we're spending more than ever on cybersecurity, why do the outcomes keep getting worse, and what needs to change, starting with how we think about people, systems, and trust in the age of AI? Theresa, welcome to The Segment!
Theresa Payton 01:03
Oh, thank you. I've been really looking forward to this, and thanks for my cup of coffee.
Raghu N 01:07
No problem, always. I mean, it's the least we can do, right? So thanks so much for being here. Incredible background: Chief Information Officer at the White House. Give us a bit more information on how you got there, and maybe some fun anecdotes from that role?
Theresa Payton 01:22
Yeah, sure, absolutely. First of all, what an incredible honor to be able to serve the country that way, because I come from a long line of US military: grandfather, father, uncles, actually aunties, who all served in the U.S. military. And so I'm so glad I had the opportunity to serve in this capacity. It's interesting, because I started off in the financial services industry and banking, and I always say my job was to go from the server room to the boardroom. So I started off as a developer, managed big teams of developers, but I was always raising my hand and saying, hey, I could do more, or I'll volunteer for that, or I don't know how to do that, but I'll figure it out. And so I actually found myself starting to brief the board of Barnett Bank, which is now part of Bank of America, at age 23. I look back now and think, gosh, if 23-year-old me had known how high the stakes of those briefings were, but I was just like, yeah, of course I'm briefing the board. And it's not because I was more gifted than my colleagues, because I had incredible colleagues to work with at Barnett Bank; it was really just because I kept raising my hand and saying yes: this is important, and I want to work on these cutting-edge projects to make banking seamless, elegant, awesome to use for our customers, while at the same time fighting criminals. And we didn't even call them cybercriminals back then. It was fraudsters and, you know, check washing, and all the things that were going on back then, which still actually exist today.
Raghu N 02:45
That's amazing. Actually, just before we go further toward how you ultimately got to your role at the White House: that experience of someone coming into a role and, despite being, in inverted commas, fairly junior, taking those opportunities. Because today we hear so much about lack of opportunity in cybersecurity, and about the huge disparity between the number of open vacancies and the people available to fill them. Can you talk a bit more, and provide some experience and hope, about how those coming into the trade can really put themselves forward and get those opportunities?
Theresa Payton 03:20
Yeah, I love this question so much. Well, for starters, if you're a hiring manager, really read those job descriptions, because I read them and they're soul-crushing. I mean, it's like: do you have all these certifications? Do you have all this experience? It better be this, it better be that, it better not be this. And these are, like, the minimum qualifications. Why? Why are you requiring all that? Is that getting you what you need? Did the regulators make you require that? So really reimagine: what is the mission set? What are you trying to accomplish? What is the noble cause? What is your team fighting for? What makes everybody jump out of bed every day and be really energized about coming to work for you? So rewrite those job descriptions. Also be thinking about transferable skills. If somebody is awesome at fraud, they will also be awesome at cybersecurity. If someone is really good at communications, gosh, we could use people who are really good at communications in cybersecurity. So hiring managers, reimagine what you're looking for. What I would say to people who are struggling to get in, or struggling to get their next role: don't overlook what your community needs on a volunteer basis. So if people are telling you you need this skill and you don't have it, then what I would say is, go to all the things you're passionate about in your community and say to them, hey, this is what I do for a living, I'd like to volunteer. You're not going to get turned away, and you're going to get real, transferable experience that you can take into applying for that next dream job that you want.
Raghu N 04:44
You said awesome at fraud. Did you mean awesome at the act of fraud, or at detecting it?
Theresa Payton 04:51
Well, I mean, hey, just make sure you're a force for good. Okay, take us to the White House? Yes. I had this amazing opportunity to work for President George W. Bush in the second half of his second term, so that's 2006 to 2008. And you know, that sounds like a long time ago, because it was, it was really a long time ago, but there are a lot of things that haven't changed. For example, the first-ever iPhone came out in 2007, so what an incredible time to be at the White House. President George W. Bush focused on a couple of key things. One, always leaving things better than we found them, and he wanted it to be sustainable for multiple presidencies, not just the next people showing up. The other thing is, he really wanted to focus on having lots of different ideas and lots of different backgrounds. That's why they came looking for me. I never worked on government things, I didn't work on any campaigns, but they found the work in banking appealing and transferable. And then, you know, the last thing that I really took away from my time there: when I worked in banking, if bad things happened on the fraud or the crime side, I would think to myself, gosh, if we could just get to our customers and tell them a little bit about what we know, you would be safer and my job would be a little easier. When I got to the White House, I realized that which threats are targeting the White House shifts every day, based on what's going on around the world, and I needed to change my mindset. And I think that's the mindset that needs to change now for the cybersecurity industry.
And so I really doubled down on focusing on the human user story and designing for the human. During my time at the White House, we were experimenting with a lot of the technologies that you're seeing commercially available today, and so these time-tested principles of designing for the human user story, even if you're using stick figures on the whiteboard, are incredibly important. Then you can do the architecture behind that. Quick, funny story, since you were asking me for an anecdote that I could actually share live. A lot of people don't realize the President really loved sharing music with his daughters, and he had an iPod Shuffle. Oftentimes he would download music, and then he would listen to music while he was getting his chainsaw out on the ranch in Crawford, and he would get other people on the team out there clearing brush at the ranch in Crawford, right? Well, at that time, the way iPod Shuffles were designed, it was a feature, not a bug: as soon as you updated your playlist, it would broadcast it publicly to everybody, so that you could find other like-minded people and share more music. Not a great look to have the President's playlist being shared publicly. So I won't say what we did and how we did it, but we were able to find a way to let him uniquely be him and be a dad to his daughters, and to create these safety nets around him that were pretty much invisible to him. I couldn't say, every time you update your playlist, I need to be present, or my team needs to be present. And so we came up with a strategy uniquely around him, to protect him, to allow him to live his life both as President of the United States and as a dad and a husband, and to protect his playlist. Because things that might seem mundane, like what's his favorite toothpaste brand, what's his shampoo, what's his playlist?
Those are all things that are digital pieces of a puzzle that could put him at risk. And so having to think about those things really shapes my thinking today around protection strategies, because it's not just systems and enterprises and data, although that's incredibly important. At the end of the day, I love technology, but I love humankind even more, and so we have to protect the humans.
Raghu N 08:24
Absolutely. And I think what's also so important from that little anecdote you shared is that even at the most important levels, from a security perspective, it's securing the most basic, foundational things that matters so much. And I think nothing could be closer to the truth when we look at data, systems, et cetera, right? So, one last thing. You were there in the late 2000s, and in the last five or six years, so much cybersecurity requirements, regulation, and guidance has come out, specifically from the U.S. government. When you look back, is that something that you predicted and felt was coming, or was that really a tsunami that just picked up speed in the late 2010s?
Theresa Payton 09:13
I think it's a little bit of both. And I believe that security frameworks and regulations are incredibly well intentioned, but might be the worst thing that ever happened to us. And the reason why I say that is we sort of go into this mentality of: I've got all of these checklists, and as long as I have a product that fills each item on the checklist, you know, I've got my little board here and I'm just checking things off, my regulators are happy, my board is happy, my CEO is happy. And at the end of the day, cybercriminals and fraudsters know we have these regulatory frameworks too. They could teach a master class in human behaviors. So what they basically do is say, okay, how would I reverse engineer that? What are the things that are missing? And we are spending an inordinate amount of money. I actually wrote it down, because the math doesn't math, and I wanted to make sure I got it right for this podcast. So Gartner says we are all going to spend 240 billion, that's billion with a B, US dollars this year. But Cybersecurity Ventures says that cybercrime losses will hit 10.5 trillion. I'm going to ask you a question. Do you think that if I went to President George W. Bush, or the CEO of any of the banks that I worked for, and said, hey, boss, if you give me 240 billion dollars this year, I promise to only lose 10.5 trillion this year, that I would get approval and keep my job? Don't you think that's a pretty poor ROI? I think it is the worst the-math-doesn't-math example that I can think of, and it's what tells me that, through no fault of our own, you know, I love my cybersecurity colleagues, I love my technology colleagues, but we are stretched thin, and we're at the bottom of the waterfall.
And it is time to reimagine how we think about this, and it's time to reimagine how we spend our money, where we spend our money, and what our focus is on, because AI and other technologies like quantum, which is right around the corner, in my opinion, are changing everything about our playbooks. And I would recommend to both tech and cybersecurity that we have to get out of the way. I would recommend that we bring in designers who are known for customer aesthetics. Think about the Rosewood Hotel Group, the Ritz-Carlton. Think about Disney. Think about companies that are well-known global brands who are always thinking about a sense of place and design and aesthetics, and we need to let them redesign what the customer experience is like, because the math doesn't math anymore. And so this is our moment, actually, as technology execs, as cybersecurity practitioners. This is our unique moment to say, you know what, with AI here, we need a sea change. So sure, we're still going to do the compliance checklist, because the regulators require it and because it's being a good steward of what we're being told to protect. But it's time to also reimagine the bigger picture.
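The "math doesn't math" point above can be made concrete with a quick calculation, using the figures exactly as quoted in the conversation (Gartner's projected spend, Cybersecurity Ventures' projected losses):

```python
# Figures as quoted in the conversation.
spend_usd = 240e9      # $240 billion projected annual security spend (Gartner)
losses_usd = 10.5e12   # $10.5 trillion projected annual cybercrime losses
                       # (Cybersecurity Ventures)

# For every dollar spent on defense, how many dollars are still lost?
ratio = losses_usd / spend_usd
print(f"Losses are {ratio:.2f}x the annual security spend")  # 43.75x
```

In other words, on these projections the industry loses roughly $44 for every $1 it spends, which is the ROI Theresa is pointing at.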
Raghu N 12:10
Absolutely. And that point you made eloquently about compliance: we should accept that it's not the ceiling, it's the floor. If you do that, you are doing the very bare minimum, and it is your responsibility as an organization, as a professional, to say, how can I exceed this without, of course, compromising the user experience? Because that's where we get the real value; it's not doing the bare minimum. So actually, just dwelling on this point for a bit longer: why do you think we have this challenge? Why have we accepted that doing the bare minimum is enough? What is the cause of that? Because the numbers don't add up, as you said, right? The numbers just don't add up. It doesn't make sense. You can't put that in front of someone and say, invest more, because they'll say, well, look, the losses are just piling up. Why should I?
Theresa Payton 12:56
I believe everybody has good intentions here, and it's sort of: you have to hit these compliance frameworks, you have to hit a certain level of maturity. And then you've got outside firms that, at least for publicly traded companies, are scoring you, and the scores are published publicly. And so all of these things, again, are well intentioned. But when you look at the cybersecurity budgets, when you look at the staffing, when you look at all the products that have to be managed, it just ends up being more to be done and less bandwidth, less mental shelf space that can be dedicated to the reimagining. For many of the security teams, they can barely keep up on the bare minimum. So I think that's why we find ourselves where we are. And again, product companies, they're not the problem either. They're very well intentioned. It's like, okay, I see this need, everybody tells me this is a part of this compliance regulatory checklist they can't get to, so I'm going to build a thing to fix that thing. But again, I would challenge us: I think we all need to step away from what we're doing and say we need a sea change. We need to think differently, just like in 2007, when the iPhone came out: the first phone without a keyboard, the first phone that gave you an Internet browser in your pocket, completely reimagining how we think about phones, phones with cameras. It's time, and we have to do that, and we can't afford not to. And for anybody listening to this and saying, you know, I can barely keep up with what I have on my plate today, I would just ask: can you carve out 15 to 30 minutes a day just to think about how you would reimagine this if you could get a shot at it?
Raghu N 14:47
You've used the word reimagining a few times already, and I think it's such a beautiful word. But as you've just said, the challenge is that the burden of keeping up with the bare minimum is such that the opportunities to reimagine are few and far between. So take us a bit deeper into how we make that reimagining real, so that we are able to significantly improve cybersecurity but make it exciting, going back to the iPhone example you gave about creating an exceptional user experience, so that everyone is saying, I want to do more of this, rather than, what's the least I need to do to move forward?
Theresa Payton 15:31
Yeah. I mean, I think this is going to sound really old school and basic, but it really works. Do site visits. Whoever your technology users are, ask for permission to shadow what's going on. Sit in your call centers. Now, do not mistake me, I'm not saying do something; you're observing. Don't sit in your SOC. Sit in your call center, listen; sit at client sites, listen, observe. Do not do anything, do not take over; just kind of fade into the background. You will learn so much about what is not working. And then, if you get the opportunity, customer focus groups are so huge; the banks are stellar at doing customer focus groups. We did the same thing at the White House. There are 13 components that make up the Executive Office of the President, and then you have the First Lady's office, the President's office, the cabinet members, and the Vice President's office. You have a lot of people with their own stakeholder groups. And I did a lot of listening, and in listening, that's where I saw that something we were super proud of on my team got in the way of mission. So you really have to understand mission. Why is the user interacting with the technology? What are they multitasking and doing before they interact with the technology? What are they multitasking and doing after they leave the technology? All of these things will tell you what is missing in your safety nets, what is not working, and all of the workarounds. Again, that is the basis for reimagining the storyboard.
Raghu N 17:04
I feel this is such a basic need. Focus groups have been part of product design for generations now. But when I think about it from the perspective of, let's say, security vendors and security products, we don't spend enough time asking our end users how they use it, how they interact with it. Why is that discipline gone from cyber specifically?
Theresa Payton 17:27
I don't know if it ever fully existed. So I'll give you an example from my time in banking, being on the CIO side of things, but at a time in banking where fraud and security still reported up to me: I had responsibility on my P&L for fraud and security. So even though those teams might have reported to somebody else, they dotted-line worked on my team. And at the time, I will tell you, any product vendors that we interacted with saw two customers: they saw the bank as a customer, and they saw the bank's customers, on the business side, as another customer. And so they were constantly doing focus groups with the bank as the primary customer and with banking customers during implementation. I never saw as much of that happen on the fraud tool side and on the cybersecurity tool side; there, they basically saw the purchaser as the customer, and not necessarily the end user as the customer. And I'm not saying that never happens, but I don't feel like that discipline was necessarily there, at least from the front-row seat that I had. So those are the types of things, I will tell you, that when I do meet with security vendors today, I will say to them: I need you to hear the frustration at that true end-user endpoint, not at the purchaser endpoint. So that's the thing: cybersecurity is somewhat disconnected from the true end user, because they see the CISO as the customer, and of course, the CISO is one of the customers, but they are not the actual end user.
Raghu N 18:59
Absolutely. And I think we do have this blurring between a capability, which is what something can do and how it can do it, and the end use. We spend a lot of time on capabilities and delivering lots of capabilities, but we don't spend enough time asking: how is this going to be used to actually solve an end-user problem? And I know this continues to be the big challenge.
Theresa Payton 19:21
Well, I'm going to give a simple example, and I know we're moving away from passwords, or so we say. But I don't know about you: have you ever been at a cocktail party and somebody hears, oh, you work in security? I've been looking for the person to thank for long passwords, man, I love those things. Has anybody ever stopped you and said, I love it, the more complicated you make it, the more I love it?
Raghu N 19:44
I'm obviously going to the wrong parties, because no one has ever said that to me.
Theresa Payton 19:49
Me neither. As a matter of fact, usually they're like, let me tell you everything I hate about it. And then I hear all the things, and I'm like, I know, it feels like it's designed to keep you out and let the bad people in. I know, I'm sorry. And I just say we're always continuing to improve and do better.
Raghu N 20:04
I'd actually say, and this has been said before, that a lot of those practices exist because we like to think that cyber is this complex, very clever thing, and if it's a very clever thing, it needs to be really complex, otherwise everyone could do it. We've almost gone to: we need to make this as complicated as possible. And as a result, those we want to adopt it are actually finding ways to avoid it, and, in those gray areas, saying, actually, you know what, I've done enough, right? I can say that I've done enough, and I'll accept the rest of the risk. And we're not creating a culture where people are saying, I actually want to drive down my cyber risk.
Theresa Payton 20:42
Yeah, I think we have to change our mindset. So I'll give an example. Every time I take my car in for service, I'm assigned the same service manager, Brian. Love him to pieces. And he will say to me, before anything gets done: you're going to get a survey, and I want to score a 10 out of 10. So if at any time during this experience there's anything that would make you want to give me anything less than a 10 out of 10, would you please give me a chance to address it? He's been my service manager probably 10 years, and I've even told the CEO of the company, don't ever take Brian away from my family, right? But here's the thing: the first time he said that to me, I was like, oh, is he gaming the system? But then I realized, no, he really meant it. And it opened up a dialog, an honest conversation, about what wasn't working. And now there's a level of trust there, so that when I see something not working, I know I can tell him and it's going to be addressed, or I'll understand better why it can't be changed. I would love for everybody in security to start thinking about your stakeholder group and your end users and saying: I'm going to give you a survey, and I'd like to get a 10 out of 10. Because what's going to end up happening is you're going to start to get this customer user story sent to you, scenario by scenario, and then you're going to know: okay, I think it's a seamless, elegant design, but clearly you don't, because you gave me a two out of 10, or whatever, so give me a chance to fix that. And so I would just say, as we think about reimagining, I know how busy everybody is, I know how stretched thin you are, and I know how hard this job is, so look for simple ways to gather these customer user stories, find ways to listen, and you'll get the business case for it.
So if you're able to go back to your executive team and say, I can improve our net promoter scores of our customers. If you give me this money to invest to do something different, we'll be safer, more secure, more in regulatory compliance, and I can get our customer SAT scores up. That's a winner.
Raghu N 22:58
Absolutely. And happier customers do more with your product, right? Make themselves more secure, and actually evangelize.
Theresa Payton 23:05
Yeah. So one of the best-in-class companies I've seen, I won't name them, but they actually have customers who, in their focus groups, have said: I am referring other customers to you because the security is so well thought out and seamless and easy to use, and I feel so much safer and more secure, and I don't have the same problems I had at another institution. When your customers are evangelizing your security construct, you've got a winner.
Raghu N 23:34
Yeah. I mean, just imagine if every single vendor here at RSAC 2026 took that back with them and implemented it. How much more exciting the conference would be next year, because you'd be walking around thinking, oh my god, as a customer, I love every single one of these vendors and how they do something, and it's so useful and easy for me to adopt. Okay, so moving on from there. You touched on AI and quantum; you just name-dropped them for a second, right? So let's draw the map between AI, however you want to think about that, in the hands of the defenders, in the hands of attackers, AI systems, et cetera, and also quantum and this concept of a post-quantum world. Which of these two gives you the biggest cause for concern when you think about securing it?
Theresa Payton 24:24
So this is a tough one. Can I say both? Am I allowed to choose both, as long as I justify it? Yeah, I'm going to say both. AI right now has your most privileged access and is your most worrisome insider threat. And then on quantum: okay, so there's something we've all skipped because it's a really hard and gnarly problem, and that is data. Data architecture, data lineage, data creation, data disposal, enterprise data access, data labeling. Data is hard. Quantum means that the encryption of your data, for certain things, doesn't matter anymore in a post-quantum-cryptography type of world. So the question is: do you have data elements that, no matter what, have a forever shelf life, and have you classified those? Because that's where quantum is a risk, and that's where you need to be redesigning that architecture. I'll give an example, because it's something so public: the secret formula for Coca-Cola. The secret formula for Coca-Cola has a forever shelf life; for as long as Coca-Cola is around, that formula is really important. So the question is, is it electronic? Maybe it's not. If it is electronic, I don't care what encryption you have on it, it's going to be busted. So if it is stolen today, and it's encrypted, and it has a forever shelf life, what does that mean from a data incident, data breach, and data responsibility standpoint? So, the lineage of data and its true value over time: is this a piece of data that, if it were stolen today, it's a problem, but two years from now it's old? Or is it a piece of data that, if stolen today, is a problem, and 10 years from now it's still a problem, or 50 years, or forever? We haven't done a good job, in my opinion, in most organizations, even the most advanced ones, of really doing data classification. And a new layer, a new nuance on that, is: what is the true shelf life of data and its value over time?
Raghu N 26:41
So just on that: would you say that the challenge with ensuring we're quantum-safe is the fact that we don't know how bad the problem is? Is that the most significant challenge with it?
Theresa Payton 26:59
Yeah, there are so many unknown unknowns when it comes to data classification and data protection. So I always say to organizations: now is the time to think about, can I tokenize access to data? Can I anonymize data? Can I digitally shred data? Don't run afoul of any laws, so talk to your General Counsel first. But are there some things you could be doing now to get your house in order for post-quantum? There are things you should have already been doing, just knowing ransomware and data exfiltration are a thing. But as we saw with Stryker, who, through no fault of their own, got targeted by Iran, and the attack was successful, as far as we know, with the investigation ongoing, there wasn't data exfiltration; it was data wiping. So, AI. Let's come back to AI. A couple of pointers I'd like to give, because I like to be helpful; I don't like to just say it's bad, it's dark, it's ugly out there, boom. So, some pointers here. I would ask your AI vendors: do we have immutable backups and logs of what AI is doing? How do I make sure that AI is working in Socratic mode, so my team can never trust and always verify, so that we don't have model shift, we don't have AI poisoning, and we don't have AI manipulation? And then lastly, obviously, your governance team: who is your governance team, and is there a voice of the customer on that governance team who is empowered to speak up on behalf of the customer without fear of retribution from the rest of the governance team?
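As an illustration of the "immutable logs of what AI is doing" pointer, here is a minimal sketch of a tamper-evident, append-only log. All class and field names are hypothetical, and a production system would write to write-once storage rather than memory; the sketch only shows the core idea, that hash-chaining entries makes deleting or editing a record detectable.

```python
import hashlib
import json

class AuditLog:
    """Append-only, hash-chained log of agent actions. Each entry commits
    to the previous entry's hash, so deleting or editing any record breaks
    verification (tamper-evident, though not by itself tamper-proof)."""
    def __init__(self):
        self.entries = []

    def append(self, action: dict) -> None:
        prev = self.entries[-1]["hash"] if self.entries else "genesis"
        payload = json.dumps(action, sort_keys=True)
        digest = hashlib.sha256((prev + payload).encode()).hexdigest()
        self.entries.append({"action": action, "prev": prev, "hash": digest})

    def verify(self) -> bool:
        prev = "genesis"
        for e in self.entries:
            payload = json.dumps(e["action"], sort_keys=True)
            expected = hashlib.sha256((prev + payload).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True

log = AuditLog()
log.append({"agent": "model-a", "tool": "db.read", "args": "customers"})
log.append({"agent": "model-a", "tool": "mail.send", "args": "report"})
print(log.verify())  # True
log.entries[0]["action"]["tool"] = "db.delete"  # a tampered record...
print(log.verify())  # ...is detected: False
```

This is the property that defeats the track-deleting insider Theresa mentions later: an engineer (or an agent) can act, but cannot quietly rewrite the record of having acted.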
Raghu N 28:31
So when you spoke about immutable backups, right, are you saying not just that they capture what data the AI has accessed and how it's potentially manipulated it, but that those backups also contain the reasoning process?
Theresa Payton 28:52
Yes, absolutely: the steps AI has taken in order to reach a particular response. Yes, because AI is going to go rogue. It's going to go rogue. And so you need to know, and it needs to be immutable, because, you know what, I've worked on insider threat cases where sophisticated engineers delete their tracks, and it's really hard to prove they did what they did. And AI is a sophisticated engineer with privileged access, and access that you don't actually know about.
Raghu N 29:14
Right. So let's take what you said about insider threat. Over the last 15 years, we've had a range of technologies and approaches around how we detect insider threats. But given the proliferation of AI now, and how many times it's used day to day, how do you build the technologies that are able to keep up with your insider threat analysis? Because it's not just a case of keeping those backups; it's really a case of looking at that event data, at those actions, in real time, and saying this has gone rogue, because otherwise it'll be way too late. And we're talking about way too late not in terms of days or weeks, but in terms of seconds to minutes.
Theresa Payton 30:08
Yeah, it's a combination of things. And you know, I'm really energized by the tremendous possibilities the technology is going to afford us, but I would say we already know how to do this. Everybody has third-party vendor management; you have contractors, you have consultants, you have employees. You have to think long and hard about user access controls and authorizations: are people stepping out of bounds? In banking, for example, you have maker-checker rules, so the person who opens up a bank account on behalf of a customer can't then open up a different bank account on behalf of that customer and wire transfer the funds to themselves, right? There are maker-checker rules around things like that. So we already know how to do this. What you have to do is say: just like I've had to think about people as insiders, I now need to think about every AI implementation as an insider. And you know, when I see people saying, hey, I gave AI access to my subscriptions and my laptop, I wonder, did you really give it access to everything? Because what happens when, on your behalf, it books you a trip to Tahiti and you can't go? I mean, are you really giving it all that, or are you just saying that so you get clicks and likes on social media? It's really hard to know. But I think the good news is we already know how to do this. We just have to think about it differently and design for it.
Raghu N 31:36
Let's use the current hype and excitement, but also fear, around OpenClaw. Put what you've just said in the context of the current status of OpenClaw.
Theresa Payton 31:50
So this is something for your governance task force to really talk long and hard about, and procurement had better be your best friend. Provenance and software bills of materials are more important now than they ever were. So when we think about something like OpenClaw: there are a lot of amazing tools being built on these open models, and you need to understand the origin story, the country these open models come from. That is not to say that these open models are bad or that the people behind them are nefarious actors, but just understand the geopolitics of the world right now, and really understand your software bill of materials. So if you're looking at the origin story of the technology for the open model, and it tends to come from known, sophisticated nation states, you should be thinking twice about that. But my concern is, do companies even know? So that's a question for your vendors. It's a quick questionnaire you ask them: please list all the open models that you're using as it relates to this AI product and AI implementation, and do you have a third party that certifies that these are the only models you're using? So, all the things we already know how to do, you just need to do a little bit more due diligence here.
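The vendor questionnaire Theresa outlines can be operationalized as a simple check over a declared AI bill of materials: every model a product is built on gets flagged if its provenance is undeclared, uncertified, or in a jurisdiction the governance task force wants reviewed. The field names and placeholder jurisdictions below are illustrative, not a standard SBOM schema:

```python
# Jurisdictions requiring extra review are set by your governance policy;
# "Country A"/"Country B" are placeholders, not real guidance.
REVIEW_JURISDICTIONS = {"Country A", "Country B"}

def flag_models(declared_models: list[dict]) -> list[str]:
    """Return the names of declared open models needing manual provenance review."""
    flagged = []
    for m in declared_models:
        origin = m.get("origin")  # a missing origin is itself a red flag
        certified = m.get("third_party_certified", False)
        if origin is None or origin in REVIEW_JURISDICTIONS or not certified:
            flagged.append(m["name"])
    return flagged

# A hypothetical vendor response to the questionnaire:
vendor_declaration = [
    {"name": "open-model-1", "origin": "Country C", "third_party_certified": True},
    {"name": "open-model-2", "origin": "Country A", "third_party_certified": True},
    {"name": "open-model-3"},  # no origin or certification declared
]

needs_review = flag_models(vendor_declaration)
```

Running this over the sample declaration flags the model from a reviewed jurisdiction and the one with no declared provenance, which is the "do companies even know?" gap the questionnaire is meant to expose.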
Raghu N 33:14
So let's close the loop on a few things. Right at the top, you gave those statistics about security spending and cybercrime cost, and one very much outweighs the other, the wrong way around. We've also spoken about the real and present challenge of ensuring we're post-quantum safe, and about how we manage and effectively govern and secure AI. So if we connect it back to the fact that the model is broken, with all of these new challenges upon us and very much here, how do we rectify the model? I mean, it's not just about the user experience. It's got to be more than that to get to a place where that spending is generating a real reduction in cybercrime cost.
Theresa Payton 34:06
It's conversations like this, happening right now, that are the change that needs to happen. It's not going to happen at the speed of machine; it's going to happen at the speed of your network and trust, because we all have to be sharing with each other what's working and what's not working. The problem is the criminals are working at the speed of machine, innovating and being very creative about how to take advantage of us at this time. So it's conversations like this, and people really getting the opportunity, under the Chatham House Rule, to say, "Well, this is what we tried, and it didn't work, but here's what we learned from it, and now we're trying something else that does seem to be working." It's really going to be sharing those experiences from the trenches with each other that propels us forward.
Raghu N 35:01
And that's got to happen at a speed it hasn't happened at before, right? If hackers are moving at machine speed, we can't still be operating at, let's say, glacial speed, right? So of course, our listeners are going to hear this post-RSAC. When you leave to go back home Thursday, what do you hope to take with you based on your engagement with vendors, customers, partners, and so on? What would you like your takeaway to be?
Theresa Payton 35:31
I am definitely looking for a renewal of optimism that we're all focused on the right problems, and that it's not just about more people, more money, more frameworks, and more products. It's really about how, in this unique time of technology transformation, we take advantage of this to really leapfrog forward from where we've been. Again, I'm looking forward to the conversations, because I do believe they will renew my optimism that we're up for this challenge, and that if we all share our intel with each other, again under the Chatham House Rule and within the right regulatory frameworks, we're going to win. And it's not about beating the machines, actually; it's making sure the machines are serving us and not the other way around. As well as that, we're going to win against cyber criminals and nation-state operatives and fraudsters. We have a unique opportunity now. I really feel that on both the offensive and defensive side, we can have the upper hand if we all share best practices.
Raghu N 36:47
I mean, we have to have that hope, right? That yes, this is a battle we can win, versus just accepting that, oh well, we can only do so well. So actually, just a fun thing to finish with. I'm sure there'll be lots of marketing hype and marketing buzzwords, and I mean, it wouldn't be RSAC if there weren't. If you could go around all the vendor booths and take, let's say, a bit of masking tape and put it over some particular buzzwords you just never want to see again, what would they be? "Powered by AI"? I hope you've got a lot of masking tape.
Theresa Payton 37:25
Maybe it's duct tape that we need. Exactly, exactly. My father raised me that if you can't fix it with duct tape, it's probably not worth saving.
Raghu N 37:33
Maybe we'll walk around with a cart of duct tape, because we'll need it at pretty much every booth here. Well, Theresa, thank you so much for joining us on The Segment. What an illustrious background, and just hearing that experience, how much you've been on the inside and seen so many of the things that are reality today being shaped, and also offering that message of hope about how this is very much a battle we can win versus just survive, I think, is incredibly powerful. So, Theresa, again, thank you so much.
Theresa Payton 38:09
Oh, thank you for such a fabulous conversation, and hopefully we'll have one again soon in the future.
Raghu N 38:14
Perfect. Alright. Fantastic. Thank you. Thanks for tuning in to this week's episode of The Segment. For even more information and Zero Trust resources, check out our website at Illumio.com. You can also connect with us on LinkedIn and Twitter at Illumio, and if you liked today's conversation, you can find our other episodes wherever you get your podcasts. I'm your host, Raghu Nandakumara, and we'll be back soon.

