The webcast, hosted by Compliance Week and HALOCK, featured Chris Cronin discussing the SEC’s cybersecurity rules and the importance of Duty of Care Risk Analysis (DoCRA). The SEC’s requirements for companies to describe their cybersecurity program in detail and the challenges in communicating it to investors were highlighted. The communication flow in cybersecurity risk management, risk assessment criteria, and the concept of reasonableness in cybersecurity were explained. The importance of consistent reporting, governance involvement, and legal history in defining reasonableness was emphasized.

Analyze Your Risk

 

TRANSCRIPT

Hello, and welcome to today’s webcast brought to you by Compliance Week and HALOCK.

I’m Adrian Appelle with Compliance Week, and I’ll be your host.

Today’s webcast is Almost Everybody Is Unprepared for SEC Cybersecurity Disclosures, but You Can Get Through This.

Before we hear from our presenter, let me review the agenda.

We’re scheduled to go for one hour.

After the presentation, we’ll have a question and answer session. Your questions will be kept confidential and anonymous so please do send them in. You can ask your questions at any time by going to the ask a question function on the left hand side of your screen.

I’ll pose them to our guest at the end of the presentation.

After the Q&A, the webcast will end.

The webcast also offers CPE credit for all attendees.

Please use Google Chrome or Firefox as your internet browser and disable your pop-up blockers to ensure access to the exam.

Once I have signed off and the webcast is completely over, the final examination will be presented automatically in a separate window.

If you have trouble viewing the CPE test, or receiving your credit, please send an email to webcasts at compliance week dot com.

Again, please use Google Chrome or Firefox as your browser.

At any time during the presentation, listeners can download the slides from the drop-down menu on the left-hand side of the screen.

We welcome your thoughts, as we are always looking to improve, so please send comments in as well.

If you want to increase the slide size, you can hit the view slide full screen button, which is at the top right of your screen.

Lastly, a help button is located in the upper right hand corner of your screen if you need assistance.

I’d like to welcome today’s speaker. It’s my pleasure to introduce Chris Cronin, partner at HALOCK Security Labs.

Chris has over fifteen years of experience helping organizations with policy design, security controls, audit, risk assessment, and information security management systems within a cohesive risk management process. Great to have you with us, Chris.

With that, take it away, Chris.

Thank you.

We’re still on the today’s speaker slide; I’ll bounce back real quick. I’m just gonna take a moment to ask why you really would even care to listen to me, because I know we’ve got a wide variety of audience here. So I developed something called Duty of Care Risk Analysis (DoCRA), and that’s gonna be integral to the discussion we’re gonna have today.

I’ve authored papers, including the Center for Internet Security Risk Assessment Method (CIS RAM), and I’m a partner at HALOCK Security Labs and at Reasonable Risk. And what you’ll see are some samples of the work that we do with our clients that got them ready for the SEC rules before they even existed.

So what I wanted to particularly point out is that I started out in technical operations and then went to audit. I know that in terms of Compliance Week, we have a lot of folks in the audit and compliance fields who’ve lived that life coming from technical operations. I went into cybersecurity to specialize in a certain area of compliance and audit, and then developed a level of expertise that comes in as an expert witness. So everything that you’re gonna see in this deck today is focused on that sort of life experience, but specifically on working with regulators to help them figure out how to talk to you about risk.

So we’re gonna talk about how the Securities and Exchange Commission has raised the bar.

And you’ve seen that, but we’re gonna talk about specifically what that means. There’s this rise in the use of the word governance for cybersecurity. We’re gonna talk about why that is. We have both standards bodies and regulators asking for it now. And then we’re gonna talk about how you’re gonna prepare. During this session, HALOCK is going to be revealing a little bit of our secret sauce. There’s something that we’ve been doing that’s gotten regulators to get comfortable with Duty of Care Risk Analysis (DoCRA) as the definition for reasonable security.

We want you to be aware of how that works so you can start to use it in your favor.

So why are you here?

We know some of you are interested in the SEC rules.

You’re with public companies. You’ve got that role and responsibility.

Some might be here because, you know, this is a topic that’s interesting.

One that you’re gonna have to be aware of as part of any kind of regulatory regime that you’re responsible for. Or you’re just here for CPE credits. We’ll try to be equally entertaining for everybody.

But I want you to be aware that everything we’re talking about here, although it’s in the framework of the Securities and Exchange Commission rules, which we’ll explain in some detail, we’re finding the same success with so many other regulations like this. We’re confident that our clients who have taken what they’ve been doing for their normal risk management and compliance activities are gonna do just as well with the SEC. We’ll show why that is the case.

So, yes, the SEC has raised the bar.

What people have been saying for a long time is, we have good cybersecurity policies, and we have good cybersecurity controls.

And the SEC is saying terrific, show us.

It’s time to tell your investors what those are. If you’re gonna say it, show it.

In other words, don’t say something if you can’t back it up. So that’s where people are now.

It isn’t telling you what to do. It’s telling you that you must describe to people what you do. Right? Except for one little thing about, you know, the materiality disclosures of an incident. That’s a little bit of what to do there.

The big question we’re gonna ask today, and those who have taken a look at that SEC final rule will notice a few things that just seem, on their face, unimaginable.

The big question that should hit people is: how can you possibly describe your cybersecurity program with sufficient detail that a reasonable investor would understand it, while not sharing too much? If you look through the final rule comments, you’re gonna see that there was a lot of discussion and concern about this. How do I tell someone who may not know anything about cybersecurity at all, how do I tell them how cybersecurity risk management, governance, and strategy work, or how materiality is determined, in a way that they would even understand it?

Frankly, in HALOCK’s experience, most organizations don’t know how to describe risk assessments and risk management and strategy and governance. So we’re gonna show you exactly how to do that. You’re gonna see by the end of this presentation (I wanna be sure I’m navigating correctly here) that we’ve got answers to all those questions.

You’re going to conduct a risk assessment, a real risk assessment. You’re going to define risk assessment criteria that include impacts to your corporate goals, your corporate mission, and your obligations to protect others.

We’ll show you how that works. You’re gonna measure and communicate risk status, problems, and progress. You’re gonna show and tell a consistent story about how materiality and reasonableness work throughout the organization.

So all of this says, you know, wait, that’s a lot of detail. Thanks, but I don’t know what any of that means. We’ll show you what that means.

And we’ll do it by showing you this very complex diagram, this chart. This is going to be a little intimidating if you’re seeing something like this for the first time. We’re about to spend some time during this presentation showing you how each of these things is built, so that you’ll be able to have a clear piece of evidence that your risk management, governance, and strategy are all working, and you’ll have a way to communicate with executives about what’s happening with cybersecurity in a way that they can make a decision that’s appropriate for their level.

And here, at a very high level, you’re seeing something that we’ll look at in a little more detail later on. It tells us where we are against our level of acceptable risk, what our plan is for getting to acceptable, whether we are varying from our plan, and what we need to get back on plan. If you can have that kind of conversation in your organization, then you’re handling just about everything that an investor needs to know about your program. Right?

Alright. So we’re gonna start this early with a survey question to get your juices going and get you familiar with the interactivity.

The first survey question is: how prepared are you for disclosing your cybersecurity program? We wanna get a sense of where people think they are now before we go in too deep and look at the particulars. We’ll give you one minute to answer this, and then we’ll get back on program. How prepared are you for disclosing your cybersecurity program?

Okay pencils up.

Onto the next slide.

Alright.

Let’s define these terms. So, cyber security, risk management strategy, governance, and incident materiality are terms that are all used in the final rule.

We don’t want this to be ambiguous. We wanna be real specific so you’ll know what you’re saying for each. We’ll take these one at a time. Risk management.

And by the way, you know you have a CPE exam at the end of this. So I’m telling you, pay attention to certain slides. And this is a good one to pay attention to. We want you to do well.

What is cyber security risk management?

So let’s start with the bottom here. Risk equals impact times likelihood. Right? You’ve seen this before. This means something in cybersecurity risk.

Some people will say risk equals probability times magnitude of harm.

Some will say impact times likelihood times vulnerability. People have different ways of calculating this. But risk is always some component of impact and likelihood. It’s not a maturity assessment.

What is my score? One, two, three, four, five? It’s not a gap assessment. Those are helpful, but the risk analysis is a question about what do I foresee as being a problem?

How likely is it to happen, and what would be the effect?

If I do that, if I know my current risk, that’s the start. If I know my target risk, where would I be okay? Where would everyone else be okay? That’s important.

If I have a plan for my safeguards so that I know how we’re going to reduce risk by implementing them, then we’re gonna be okay. Then providing the right resources: part of management is making sure that people have the right priorities, the right budget, the right staff to get done what they said needs to be done, and then reporting your progress to people who can make decisions. That’s risk management.
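Here is a minimal sketch of that idea in code: current risk as impact times likelihood, compared against a target level. The 1-to-5 scales and the specific threats are illustrative assumptions, not anything prescribed by the SEC rule or CIS RAM.

```python
# Minimal sketch: risk = impact x likelihood, compared against an acceptable level.
# Scales and example threats are assumptions for illustration only.

ACCEPTABLE_RISK = 4  # assumed target: impact 2 x likelihood 2

foreseeable_threats = [
    # (threat, impact 1-5, likelihood 1-5)
    ("Ransomware on file servers", 4, 3),
    ("Lost unencrypted laptop", 3, 2),
    ("Vendor portal credential theft", 3, 4),
]

for threat, impact, likelihood in foreseeable_threats:
    score = impact * likelihood
    status = "acceptable" if score <= ACCEPTABLE_RISK else "needs a plan"
    print(f"{threat}: current risk {score} ({status})")
```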

Alright?

Now we draw a distinction between that and cybersecurity strategy.

Strategy is a way that the organization says we are going to operate in cybersecurity risk management in a certain way. We’re gonna use standards for risk analysis, standards for how we manage risk.

We’re gonna make sure that risk management is a part of the organization the way everything else is. So when you think about strategy, you’re thinking about finding specific standards that will drive your behavior to be correct, and doing it in a way that the rest of your organization would manage any other risk. We’ll illustrate how that works in a little bit. This is an important part of the SEC rule. Are you integrating cybersecurity risk into your business? In other words, are you handling that risk the way you would handle any other risk?

Right? So strategy is selecting the standards and making sure you’re integrating this into your organization. Then governance. The simplest way we know to talk about governance is to say that the right level of management is making the decisions where they can be effective in their management role. So let me read that word for word: responsibilities for cybersecurity are at the level of management whose role is to effectively manage that risk. If I’m telling an executive that they’ve got responsibilities, the executive doesn’t have the responsibility to make sure patches are implemented on applications, nor is it the responsibility of a systems administrator to be sure that the organization understands all their regulatory responsibilities.

That wouldn’t be good governance. Governance is the right level of management making the right decisions at their level where they can be effective. Okay?

And now, what’s really important to understand about why this concept of governance is rising is that we’ve been on a lot of breach cases where we look at evidence of how the organization that was breached was being managed. And we find that the executives, management, and technologists are not speaking the same language.

They’re often not even being honest with each other. We’ll look at presentation decks at the executive level. We’ll look at audit responses. We’ll look at evidence of how the organization became aware of what their responsibilities were, whether there was an intolerable problem, what they did about that intolerable problem, and we always see governance failing. People are not making the right decisions at the right level.

And that is making it impossible for executives to know how to make a decision, and making it impossible for someone at the ground level to say, I need help. I can’t get this done unless I have the right number of colleagues, the right tools, the right priority.

Right? So governance is a matter of saying it’s the organization’s responsibility, not just any one person’s.

On the materiality, we’re gonna talk about this idea of showing and telling a consistent story with materiality.

I’ll describe materiality for you in just a bit, but the first thing I wanna really draw on is the concept of materiality having a role in everything we just talked about. Materiality, very briefly, would be: if everything’s okay and what you expected it to be, it’s not a problem. If something has broken in a way that someone, like an investor, could be harmed, it’s a problem.

It’s not the only definition for materiality. You’ll add other things to that. For example, at the bottom right, you’ll see corporate discretion range. You’re gonna have any number of things that you’re going to use for your definition of materiality.

You’re gonna put materiality into all of your processes to create consistency in risk management, strategy, and governance.

If there’s an incident, the executives who are part of that incident disclosure are going to want to know: what did we say material was? What have we been managing to? Because if we’ve violated that, if we broke that, that has to be part of the determination.

It’s not the sole determination, because what we’re being told in the final rule is that we’re looking at qualitative and quantitative impacts, things that might hurt us, things that might hurt others, things that might hurt investors. Materiality inside a program, as you’re gonna see, is something that can be aided by your cybersecurity program, but the final discretion is with the executives who are deciding whether to disclose or not. You want this definition of materiality to be consistent throughout so you don’t contradict it. What you don’t want is evidence that you ran your risk management goals based on a materiality definition,

where your strategy told you to reach a certain level of impact, and your governance program made sure that everyone was making those decisions at the right level toward managing that, but that goal wasn’t considered during the incident disclosure.

You wanna be sure that you don’t have that contradiction. Okay?

So materiality plays a role, but the executives have the final discretion at disclosure. Alright?

We see governance coming up in the SEC rule. We see it coming up in state regulations. We see it coming up in the NIST Cybersecurity Framework 2.0 draft. We’re seeing it come up a lot, and we’re gonna see it coming up all over the place. It’s to make sure that the right level of decision is being made at the right level of the organization, so we don’t, you know, blame people who really had nothing they could have done about it because they didn’t have decision rights.

A way that we talk about cybersecurity governance is with sort of an overlay on an org chart. This is an org chart for a hospital.

And this is what you’d want communication to look like in a well-functioning organization. Each department knows what it’s supposed to do. It can report up to management to say whether things are going well or not, or things that they need, or opportunities that they have. And anyone at leadership would be able to have a conversation, because they would come from billing, they would come from pharmacy, they would come from medicine; they know the areas of expertise of how a hospital works.

You would know if cybersecurity had a good governance function if you talked about things in terms of risk goals. Right? If you talked about things in terms of cybersecurity in a way that the executives can make a decision. What we typically see is something different, where you have competence in governance everywhere, in all departments, but cybersecurity stands apart. If you were to go to an executive and say, tell me what you know about cybersecurity, and they tell you, well, the ransomware is a problem, and hackers are a problem, and there’s this guy on this TV show, then you know that they are not handling governance well. They need to be able to say something as informative as they can say about anything else.

If hospital leadership, or any kind of organization’s leadership, asked about an area of competence in their organization, say, how are we doing with emergency room intake? Oh, emergency room intake is actually going quite well. We developed this new process for triage.

We have a much better way to figure out who needs to go where, and so, yes, our numbers are growing more efficient. Right? They know how to have a conversation about that. The Securities and Exchange Commission is saying cybersecurity needs to be integrated into your business just like any of those other things. Right?

I saw this a couple years ago when it first happened. Some of you may be aware of SolarWinds. It’s a very important IT management tool that’s offered as a service, and they had a breach a couple of years ago. It was a real big deal, because everybody, especially industry and government, used SolarWinds. So if they were attacked, it meant that the bad actors had access to a lot of organizations who used SolarWinds. There was congressional testimony. There were investigations.

And I saw this snapshot on CNN, and I knew I was going to start using it. There were two things happening here.

One is the statement: SolarWinds CEO blames intern for SolarWinds’ password leak.

That’s terrible governance, right, for a couple reasons. One is, you shouldn’t be blaming an intern for a failure of the whole organization. Can you imagine if your financial reporting was off significantly, if you had a material failure in the accuracy of your financial reporting, and they said, oh, there was an intern who made a mistake, so that’s their fault? That would clearly be bad governance.

Good governance would be admitting, we have to be aware of this. It would be saying: we realize that the framework for our application and our services and our controls permitted a foreseeable error to compromise our clients and our entire system, and we have to fix that.

That would be good governance because the CEO would talk about this from the level of that CEO’s abilities. The CEO can drive the better policy, the better framework, the better service.

It can’t be the intern. That’s terrible, terrible governance. And he was appropriately humiliated for making that statement. At the same time, Kevin Mandia, whose photo is here, did a great job.

He gets a trophy, because he was saying, and this is like the incident disclosure rule, look, here’s as much as we know. We know this much, and we’ll find out more over time, and we’ll keep you attuned. Right?

That’s all the SEC is telling you to do. Within four days of determining it’s material, without unreasonable delay, tell people what you know, and then give them updates as you get updates. And if you learn more about the incident over time, you’re gonna include that in your subsequent 10-Ks. Right?

And if you find something that’s material that’s changed, it goes into another 8-K. But you’re gonna tell people what you know and what you don’t know. So in this one screenshot, we saw a terrible example of governance. Like, laughably bad, irresponsible.

And then another that was undesirable, but it was the right thing to do. You don’t want to not know. But if you don’t know, you tell people you don’t know, and you tell them what you’re gonna do to find out. Right?

So, understanding the new SEC cybersecurity rules. Before we get into more, we’re about to go into each of the clauses, okay? And we’ll talk about them. We’re gonna paraphrase them to make them easier for you to understand.

But this is something really, really important for you to understand. Because you’re gonna need to start to communicate in your organizations what this rule means. It’s true you’ll have disclosure rules.

It’s true you’ll have to make determinations of the materiality of incidents. You’re gonna have to add Item 1C in your 10-Ks. All true.

But what’s very important for your organization to understand is this paragraph. There’s a really interesting precedent paragraph to this too. I called this one out.

It says: to the extent investors view strong cybersecurity risk management, strategy, and governance favorably, registrants disclosing more robust processes more clearly could benefit from greater interest from investors, leading to higher market liquidity relative to companies that do not.

The SEC is saying, we know there’s a lot of short-term thinking in the markets. We know that people are responding to quarterly plans, and organizations are gearing themselves to quarterly plans.

You cannot run cybersecurity as a quarterly plan goal.

This is a long term goal.

And this gives you an opportunity to explain to people what you’re doing long term for a long term value benefit.

This is very, very interesting. And this is a very useful way for you to think about why you’re doing what you’re doing. It enables you to tell the marketplace why you’re making certain investments to improve the outcomes of their investments.

Okay?

Keep that in mind. It’s one of the most interesting paragraphs out of that whole final rule.

Let’s take a look at what the specific requirements are. The top part is the actual quote from the rules. I’m gonna be reading to you how to interpret this.

This first clause: please tell investors, in a way that they understand, how you manage the cybersecurity risks that may hurt them.

There’s a lot in there.

How do you tell an investor, in a way they will understand, what you’re doing? But it also says, when you read the final rule, these are incidents, these are impacts that could affect them, insiders and outsiders, qualitatively and quantitatively. Okay, we’ll keep that in mind, but we’re gonna tell you how to do this and how our clients have been doing it for a while now.

The next: please tell investors, in a way they will understand, how you make cybersecurity risk as important as the other risks you manage. Go back to that org chart for the hospital.

We want executives to be able to have as clear a discussion about cybersecurity as they would about every other part of their organization.

How do you do that?

Please tell investors what expertise you rely on.

Hey, if you have inside expertise and they’re terrific at this, that might be okay. If you don’t, we wanna know that you’re talking to someone who actually knows how to do this, an expert. The investors should know that. Right? But they should know that for anything that you’re doing.

How to interpret the next one? Please tell investors whether you consider third parties who pose a risk to you as a risk to your investors. So if you’ve got third parties and you’re saying, look, I actually don’t know how good they are at what they do, but I really, really depend on them, no one else does what they do the way they do it, but I’ve got no way to be sure that they’re secure, that’s the kind of thing you disclose to investors.

Now you’re not going to be really specific and say, Acme Corp has this problem, and they keep hiring, you know, reformed criminals to handle your credit card information, and we don’t know if that’s the right thing to do. We wanna be sure that you’ve got a good way to describe this. We talk about being concise yet prudent. Right?

Next, how to interpret this one? Please tell investors how any current or previous incidents should inform their voting or investment decisions.

So: we need you to know that we had an X issue; it was not material. It does not have any indication of a root cause that should resurface.

It did not cause the following issues, or it caused the following issues. It should not be material. That’s a disclosure that would be appropriate for this.

It would also be appropriate to say: that incident we told you about in our 10-K filing in February, we discovered what the total impact was, and it’s X. We’d be able to offset that by doing X, Y, and Z. Insurance is covering this much. It will take the next three years to recover functionality and to invest in a system so that it’s impervious to that kind of attack again.

But you’re gonna be clear about any impacts from current or previous incidents.

Next, please tell investors whether management who are responsible for running the company are involved in cybersecurity risks that pose a risk of harm to investors. This is interesting. Remember I said, show and tell a consistent story. This is one of those things where consistency really matters.

So this is beautifully architected. You know, when you read poetry, you read lyrics to an album and something comes up and you go, wow. That was a beautiful tie in. This is one of those moments.

This is a very elegant triad here. Tell investors whether management who are responsible for running the company are involved in cybersecurity risk management, really, for risks that pose harm to investors.

Because the next one is: please tell investors which management, executive, or director positions are involved in cybersecurity risks and what their expertise is.

So they’re involved.

What do they know? Are they capable of handling these things? And then, please tell investors how management are involved in cybersecurity incident management.

Okay. Now this is getting really interesting, because we’re saying: which management and executives are involved in cybersecurity risk management?

What is their capability of doing so? And what is their involvement in managing incidents as they occur?

Because what’s happening here is, it’s saying: if you’re going to be telling us what lines we’re trying to draw, what our risk goal is, what we can tolerate and not tolerate as impacts to us and to others,

I want the people who answer that question to be involved in risk management, to have some kind of confidence for having a discussion about it, and to be involved in incident management. So when an incident happens, they say, hey, this incident actually did not cross the line for our goal, we should be okay, or it did.

Because if you have different people having discussions about what the risk goals are and who should be able to make what decisions about resources, and then on an incident someone else is saying, is this material, that now creates disconnections.

Any reasonable investor looking at a program that says Jane makes the rule and Gus determines whether or not this incident was material is gonna wanna know: do Jane and Gus talk? Do they have any idea that each of them is working from the same rule? And if your governance program doesn’t say anything about Jane and Gus talking, the reasonable investor would say, I don’t see a through line here. How does what Jane says make any sense for what Gus is supposed to do?

So now you’re starting to see this really interesting part of of the rule where consistency is really gonna pay off on how you communicate to a stranger.

And then finally, please tell investors whether management reports incidents to investors. This is just a requirement of the rule. But now, if management has set the goal, they’re competent to recognize when they’ve crossed the line, they look at an incident to see did we cross the line, and then they tell people whether they did or not. This is the interconnection you want. This is that line of consistency. Alright?

I’m gonna show you one or two more slides, and then we’re gonna go to another survey question. But this is something that HALOCK has been showing our clients for years.

Twelve years or so?

About twelve years or so.

This is a map of information that you want to flow through an organization that does cybersecurity risk management. If you want this, my contact information is at the end. Take it down and I’ll send this to you.

This is basically saying the middle tier is management, the bottom tier is the team who are doing the work, and the top tier is the executives. They each have a responsibility for cybersecurity. Management is supposed to know: this is what our obligations are,

based on risk and based on regulations.

We need to report that to the team who are actually implementing controls, to let them know the level of control they have to meet. They have to report back up to us whether there’s anything preventing them from actually implementing the controls that we told them need to be in place.

Whether things are going well or not, we have to report that up to executives, or to the board, at a level that they will understand, where they can make a decision. We’ll show you what that looks like in a bit. But we’re having these communication flows up and down, where someone says, hey, I know you told me what to do, but you also told me to take care of integration after this acquisition.

And one beat out the other, and now we’re off of our cybersecurity plan. I need you to know that. And the executives say, yeah, we know that.

You let us know. Thank you. When we have to tell the board how we’re doing, we’re gonna let them know: look, that acquisition came without resources,

which took us off of our plan. That’s just what we should expect. Next time we do an acquisition and we’ve got an integration, we need to be sure that we’ve got enough resources to do this and not take us off our responsibilities.

That’s governance through risk management. Right? The right people making the right decisions where they can where they can be effective.

Now a lot of people will look at this and say, gosh, that looks complicated.

But here’s the point the SEC is trying to make. You already do this for every other part of your organization that you say matters.

This looks like what you do in finance to handle accounting.

And budget management.

Think about anything that you do in your organization that has a risk and control associated with it. Someone sets the requirement, checks in with the team to say, You gotta get this stuff implemented in this way. I gotta get feedback to see how it’s going. If I know there’s a problem, I gotta encourage people to fix it and give them the resources to do it.

They’re telling me how things are going. I report back up. If the board is concerned with how things are going, they say, you gotta do something about this, and executive management does something about it. It’s all a loop of communication.

You already do this.

Again, if you want this, just email me. You’ll see my address at the end. I’m happy to get it to you.

Okay. Now survey question.

Time for me to give my voice a break. This next question, how would you describe, I have to rephrase this, how would you describe your executive engagement in cybersecurity risk management, governance, and strategy? So how does this work in your organization?

So forgive the missing word there, but how would you describe your executive engagement in cybersecurity risk management, governance, and strategy? I’ll give you about a minute.

Okay.

That is your minute.

While I gave you your moment, I was able to look at a couple questions coming in, and I’ll answer them pretty quickly.

Just some answers that are coming up.

To know who’s responsible for following the rule, take a look at the SEC guidelines. You’re gonna see these are registered public companies.

You’re going to see separate rules coming in for registered investment advisers; they are not out yet as far as I know, but these rules do not apply to them. But, again, talk to your general counsel about that.

What board committee should be involved in this? Well, the board committee that decides they should be. We see some enterprise risk management committees, we see some audit committees stepping in, but that will be up to them. Alright? So I wanted to answer those pretty quickly, and we’ll have time for more at the end. Okay. So now we’re gonna go into what we mean by risk analysis here.

This is actually a picture of a bad disclosure. In this case, there’s been a breach.

And an organization gets this terrible message.

I’ll let you read that second paragraph, but it basically says: we want you to know how important our profits are to us. We manage our risk program to manage our profits, and we invest up to the point that our profits will be hurt.

This is not actually a real disclosure letter, but if your risk assessment program is focusing on risk just to yourself, this is what your disclosure letter will look like. You should not be managing your risk just to protect your profits. That’s what gets the regulators mad.

This is another way we describe this. Risk analysis is about likelihoods, and it’s about impacts.

Some risk analysis is quantitative.

Some risk analysis is qualitative.

But cyber risk analysis misses the point if you forget the risks you impose on others.

It’s okay for you to make decisions on risk based on your exposure and your risk tolerance. But if you do that and you’re taking someone along for the ride with you, you’re supposed to be evaluating the risk to them as well, and you need to be informing them about it. This is an important concept when we talk about reasonableness. And in the cybersecurity rule, you’re gonna see a lot of mention of impacts to insiders and outsiders, qualitative and quantitative.

This isn’t just about how your bottom line is doing. Are we hurting other people? That has to go into those analyses. Okay? Quick refresher, then. Risk is impact times likelihood, or some variation of that. But it’s always seeing into the future, having some forethought about a bad thing that might happen, what the likelihood of that is, and what the impact would be.

The idea is that, given our current state, what is the likelihood of foreseeable threats, and what would be the impact if they occurred? Right? And then, what is the level of risk we want to get to? Well, we wanna get to a point where everyone will be okay.

So I mentioned CIS RAM, the Center for Internet Security Risk Assessment Method, and DoCRA, Duty of Care Risk Analysis. They both have the same concept. CIS RAM is freely available if you go to the Center for Internet Security’s website, and it gives you instructions for how you’re going to define your risk assessment criteria. You’re gonna look at impact to your objectives. What is it you’re trying to do? This is an example of, say, a health app.

They wanna be profitable.

The mission is user health. They want their users to become healthier. Their obligations are user privacy. We have an obligation to make sure our users’ information is private.

And then they define acceptability of impacts by giving everyone the chance to say, I’m gonna be safe at two. I’m gonna be unsafe at three. This is just a way that CIS RAM does this. Some people do this with quantitative risk analysis. This is like a semi-quantitative or qualitative analysis.

But what you’d be able to do is say, I’m gonna define how I know each of those impacts is okay and how it is not okay.

If I manage my risk management and my governance and my strategy using this common definition of risk impact everywhere, then when an incident happens, I can ask: did this incident cross this line? Did we go into this unacceptable impact? And then hand that information to the body in the organization that is gonna be making the disclosure, to say, whatever you decide, just know that this incident did or did not cross this line. Right? That’s gonna be helpful, at least for consistency.
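Here is a rough sketch of that idea: one set of impact criteria (objectives, mission, obligations) used both for risk analysis and for the incident "did we cross the line?" question. The scores mirror the health-app example above; the function name and the incident values are illustrative assumptions.

```python
# Rough sketch: use the same impact criteria for risk assessment and for checking
# whether an incident crossed the acceptable-impact line. Values are illustrative.

ACCEPTABLE_IMPACT = 2  # "I'm gonna be safe at two, unsafe at three"
CRITERIA = ("objectives", "mission", "obligations")

def crosses_the_line(impact_scores: dict) -> bool:
    """True if any impact criterion exceeds the acceptable level."""
    return any(impact_scores.get(c, 0) > ACCEPTABLE_IMPACT for c in CRITERIA)

# Hypothetical incident scored against the same criteria the risk program uses.
incident = {"objectives": 2, "mission": 1, "obligations": 3}  # privacy obligation harmed

print("Feed into materiality determination:", crosses_the_line(incident))  # True
```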

But if you’re doing your risk analysis this way, and you’re looking at acceptability this way, and we really recommend you look at CIS RAM and you look at DoCRA, you’re gonna get a really clear picture of what the regulators are going for. If you have a risk that’s high, you shouldn’t be using a safeguard that’s more burdensome to you than the risk reduction would be for others. It’s a balancing test. That’s what reasonableness is. And if you come into that conversation with that understanding, you’re gonna do very well.
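As a minimal sketch of that balancing test, assuming the same ordinal risk scores as above (the numbers and the multifactor authentication example are made up for illustration): a safeguard is reasonable when its burden does not exceed the risk reduction it creates.

```python
# Sketch of the balancing test: a safeguard should not be more burdensome to you
# than the risk reduction it creates for others. Values are illustrative assumptions.

def is_reasonable(risk_before: int, risk_after: int, safeguard_burden: int) -> bool:
    """Reasonable when the safeguard's burden does not exceed the risk reduction."""
    return safeguard_burden <= (risk_before - risk_after)

# Hypothetical example: multifactor authentication on a customer portal.
print(is_reasonable(risk_before=12, risk_after=4, safeguard_burden=6))  # True: worth doing
print(is_reasonable(risk_before=6, risk_after=5, safeguard_burden=8))   # False: burden outweighs benefit
```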

We’ll tell you why in just a bit. But we’re gonna take one more moment for a survey question before we go into risk management and how those metrics pay off. The survey question is: how are you conducting cybersecurity risk assessments now? We’ll give you about a minute, and then we’ll jump back into the remainder of the presentation.

Okay. That should be enough time for that survey. We’ll wrap up with these final few slides.

You’ll now recognize this slide. It’s a little less complex than when you first saw it. But the idea here, again, is that this green horizontal line is a definition of what acceptable risk is.

And if we have a clear strategy, we’re always going for that acceptable level, and we use those criteria we just looked at as our definition for when we know we’re safe. We can know by doing a risk analysis whether we’re above that line, at the line, or below it.

When we realize we’re above it, what do we do? We put a plan in place: what safeguards do we implement over time? Every time we implement a safeguard, our risk goes down, so we can actually state in a plan how well we’re doing, and we can report that up to executives and board members who have no idea how cybersecurity works. But they can know: look, we had risk that was too high.

We had our risk goal that was much lower. We have a plan for getting to that acceptable goal. But here’s what becomes very, very helpful at these meetings, and what we very rarely see when we’re working on a breach case or working with regulators to understand what’s happening.

We rarely see this discussion where an organization says, I’ve got evidence that month by month, quarter by quarter, I checked to see how well I was doing according to my plan. And when I found out that I was off of my plan, I talked to executives about it, and I said: I’m off the plan because we didn’t get the hire we needed, we were too aggressive in the plan, we had that acquisition we had to address. Something took us off plan, or we didn’t get the money and the budget for the tools.

You need to know, executives,

that this is the risk exposure we have.

At that point, they can decide, because it’s good governance, to say: do you now get what you need to get this done? If they don’t, it’s not up to the cybersecurity practitioners to fix it. It’s not up to the auditors to fix it. It’s up to the executives to say, we think this is important enough for us to do something about it. That’s what governance is.

So when you can provide this kind of evidence, and again, our clients have been providing evidence like this for years, because this is what risk management has looked like for years,

you’re finding yourself in a really good position.
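A minimal sketch of that plan-versus-actual reporting, with invented quarterly numbers: compare total residual risk to the planned trajectory and the acceptable level, and flag the quarters where you are off plan so executives can decide about resources.

```python
# Sketch of plan-vs-actual risk reporting over time. All numbers are illustrative.

ACCEPTABLE_TOTAL_RISK = 40

plan = {"Q1": 90, "Q2": 75, "Q3": 55, "Q4": 40}
actual = {"Q1": 90, "Q2": 80, "Q3": 70}  # e.g., the hire and the tooling budget slipped

for quarter, planned in plan.items():
    measured = actual.get(quarter)
    if measured is None:
        print(f"{quarter}: planned {planned} (not yet measured)")
    elif measured <= planned:
        print(f"{quarter}: planned {planned}, actual {measured}: on plan")
    else:
        print(f"{quarter}: planned {planned}, actual {measured}: off plan, report why and what is needed")
```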

I wanna be sure that I navigated correctly. So one of the things that we’re trying to get to here is this idea that regulators are agreeing with this concept.

And since twenty twenty one, we’ve seen a lot of cases and a lot of regulators and authorities and think tanks saying, we figured out that this Duty of Care Risk Analysis is what we mean by reasonable security.

Sometimes it’s through an injunctive order. Sometimes it’s through a regulation; if you look at Colorado’s privacy regulation, there’s a whole paragraph called duty of care now on how you do your risk analysis using these principles.

But what they’re doing is they’re saying: if you can show that your risk to other people and your risk to yourself are offset by the right level of investment, where it’s not too burdensome, then you’ve demonstrated reasonableness.

So it’s become a really big solution.

How has this happened? Well, it’s happened because of this concept, and this is that secret sauce I told you about.

Regulators are vague because they’re not allowed to be more specific than the statute that authorizes them allows them to be. So if a statute said, hey, Department of Health and Human Services, hey, SEC, when you say reasonable, use this definition, use this math. They don’t do that, which means the regulators can’t define reasonable.

But they say you figure it out. Well, now you have Duty of Care Risk Analysis and CIS RAM to do that. You can use that vagueness to your advantage.

Right? Base your definition of reasonableness and materiality in law and standards and practices, things that already exist. I’ve told you about DoCRA and CIS RAM. They’re both freely available. Use them.

When you’re held to account, something goes wrong and you’re talking to the regulator, just say, this is my definition for reasonable.

It follows what regulations and litigation have said about reasonableness for generations. The reason why the regulator can’t fight back refers back to rule number one: they’re not allowed to say what reasonable is. They’re just allowed to tell you to get to reasonable.

In other words, when you have your definition for reasonable and it’s based on this legal history, this culture of law, then your argument prevails.

And this is where you want to be. Alright? So make sure that you do this. If you don’t use it, then they say you were supposed to define reasonable and you didn’t, and they can use that against you.

Alright? So this is part of the secret sauce. Now, we feel very strongly about how this will work with the SEC, because as our clients are using this in their programs, here’s what they’re now saying in their disclosures. All we’ve had to do with our risk management program clients is to say, here’s how you’re gonna describe what you already do in your disclosures.

Again, we’re gonna go back down to the bottom. DoCRA users say, our risk analysis, and I want you to think about yourself as the reasonable investor: you don’t know about risk assessments, you don’t know about cybersecurity, but you see this disclosure.

Our risk analysis evaluates potential impacts to three factors: our business objectives, our mission, and our obligations to prevent harm to customers, employees, and investors.

We invest in safeguards that are no more burdensome to us than the risk reduction they create for others. Our risk management plan targets risks that would not require repair to any party.

Now you have a clear understanding of what they mean by risk management.

Right?

The executive committee evaluates the materiality of cybersecurity incidents in part by determining whether the incident causes an unacceptable impact as defined by our risk assessment criteria: our mission, our objectives, our obligations.

So now the investor is saying, you told me that you manage your risk based on these criteria, making sure that nothing would hurt me, and you analyze the materiality of an incident using those same exact criteria.

You’re managing for me and you’re disclosing on my behalf.

Right? And then finally, the executive committee sets criteria and goals for reasonable risk using the three impact criteria of mission, objectives, and obligations. They provide resources and prioritization for reducing those risks with improved safeguards and processes.

They receive nontechnical updates about the progress of risk-reducing activities and issues that must be addressed.

They alter plans, resources, and priorities to ensure that they meet our risk goals. At no point does it say, you know, they’re making sure that patches are up to date, or they’re making sure that third parties have no vulnerabilities in their applications.

That’s not the appropriate role. Governance dictates that people make decisions at the level where they’re capable of creating an impact, at their management level. Alright? So it becomes very simple to understand how to disclose this in a way that a reasonable investor would understand, if you yourself understand how you do it; that’s the long and short of it. Remember the big answer for the day? Conduct a risk assessment, a real one, looking at risk to you and to others, making sure you’ve got a definition for what would be acceptable to all parties, not just high, medium, and low.

Define risk assessment criteria that include corporate goals, corporate mission, and your obligations to protect others. Right? That’s what you’re supposed to be doing. The final rule is really clear on considering qualitative and quantitative impacts to inside and outside parties.

You measure and communicate risk status, problems, and progress using a chart like the one I showed. You don’t have to be technical to understand that chart. You just know whether you’re on or off plan. And when you’re off plan, why? I don’t have this resource.

I don’t have agreement from the operations team. Okay? That’s something that I can fix as the reader of that. Right?

You’re gonna show and tell a consistent story about materiality and reasonableness. Once you’ve defined what you mean by reasonable, and you’ve associated it with materiality (did something break that matters to someone and has to get fixed?), then you’re gonna be able to do everything you need to do to demonstrate consistency.

That is the presentation. I know we wanna be able to ask some questions. As we do that, I’m gonna leave my contact information up.

So people can reach out if you want any material from the deck.

Thanks, Chris. That was excellent.

Really, really informative.

If you’d like a copy of the slides just presented, you can download them from the drop down menu on the bottom left hand side of your screen.

Okay. Let’s see. We did have some questions.

The new SEC rules require that qualitative and quantitative factors are included in a materiality decision.

What are examples of qualitative and quantitative risk?

Okay. That’s good. So when people talk about quantitative risk, they generally are thinking about a financial impact. What is the probability that I will have this five million dollar event?

So a ten percent probability of a five million dollar event is a five hundred thousand dollar risk. That’s generally what people mean when they think about a quantitative risk. If you’re running a factory and you’re operating to quality standards, you might also have other units in your quantitative analysis.
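Written as a quick worked equation, the expected-loss arithmetic in that example is:

$$ \text{risk} = p \times \text{impact} = 0.10 \times \$5{,}000{,}000 = \$500{,}000 $$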

But people are generally predisposed to thinking about materiality for the SEC in terms of dollars only.

The SEC is saying qualitative as well, and they give some examples. They talk about reputation and things like this. But what’s important to understand, if you look at the kinds of qualitative things when you get that deck, there are things like the ability for someone to participate in stock offerings or in the marketplace, or issues of discrimination and nondiscrimination.

Are people actually put at risk or under threat?

There are these non-quantitative things that regulators care about more and more, and they were really specific to say: we are not looking at quantitative only; we’re looking at qualitative impacts as well. So those are some that come up.

If someone wants to send me their information, I can actually send you a list of things that the Federal Trade Commission has shown as qualitative cybersecurity impacts that might also be helpful.

Great.

The next question is about reasonableness, and I think you’ve addressed that pretty well.

What does the SEC mean by a reasonable investor?

Yeah. A reasonable investor is different from reasonable cybersecurity. So it is interesting. When they talk about the reasonable investor, there’s this concept of: would a person engaging in a vote or an investment

act this way with the information that this person’s been given? So the idea of reasonable cybersecurity is that balancing test that I told you about. A reasonable investor, and you’ll have to go deeper into the literature here, but the reasonable investor is an individual who is similarly situated to any other investor who would vote, or who would invest or divest.

What would they consider something interesting enough to make that decision?

That’s a little vague, but if you have a clear definition of what that is, so for instance, if you know what causes your investors to vote and not vote, if you know what causes people to show up to earnings calls, if you know what’s interesting in your industry, you might use that data to say: we believe this is what the reasonable investor is, because we know these are the decision points people use to invest or divest. So that’s something a little different from the cybersecurity stuff, but it’s helpful to be prepared for. Got it.

I think we have time for one more. Why can’t we just say that cyber breach materiality is the same as financial reporting materiality?

Yeah. Perfect. Right. Because it has to be qualitative and quantitative. Right? So this is very interesting. They give some interesting examples in the commentary, but the idea is that we might have reason to think governance is broken, we have evidence that governance broke, and the impact was small.

And we might say that is material. I’m trying to remember whether the example comes from this final rule or somewhere else, but this is a good one.

An organization’s financial materiality, let’s say, is over five hundred thousand dollars. Right? If they have a variance of over five hundred thousand dollars in their financial report, they would say that’s material.

Someone found out that someone in finance created a vendor and an invoice for her husband’s company without supervision, violating all sorts of safeguards, and sent out an invoice for a hundred thousand dollars.

Then you’d say, okay, that wasn’t material financially, but there’s something wrong with governance if someone on the finance team was able to do that. That’s actually a problem

that investors need to know about, because it says something about governance even if it doesn’t say anything about a dollar limit that was crossed.

So I hope that helps.

That’s a great example. Yeah. Very clear.

With that, I think we better move along and say thank you very much, Chris, for the presentation today, and also thank you to HALOCK for making the webcast possible.

To obtain your CPE credit for the presentation, please disable your pop-up blockers. Then, after the webcast closes, the exam will be presented in a separate window. If you have trouble viewing the CPE test or receiving CPE credit, send an email to webcasts at complianceweek dot com. Compliance Week is now offering training courses designed for compliance, risk, and audit professionals that offer the chance to enhance one’s expertise in key compliance areas while earning continuing legal education and continuing professional education credits.

This webcast has been recorded and will be available later today to Compliance Week members on our website under the webcast tab.

And we want you to consider becoming a member of Compliance Week. If you’d like to learn more about becoming a member, please contact us at info at complianceweek dot com.

For today only, we invite you to use code webcast three sixty five to receive a membership for just one dollar a day. So please go to complianceweek dot com slash membership to learn more.

This concludes the webcast. Thank you again very much for joining us today. Goodbye.