Weak links: Third-party vendors and cyberfraud
Matt Kelly is the editor of Radical Compliance, a blog and newsletter covering corporate risk, audit, and governance issues. In this podcast, Kelly talks to FM senior editor Drew Adamek about the cybersecurity risk that third-party vendors pose, how finance departments can start managing and mitigating that risk, and the mindset shift that finance departments need to make when thinking about protecting data.
What you’ll learn from this episode:
- How technology is making it harder for companies to identify their vendors.
- The cybersecurity risk posed by the complexity of vendor chains.
- Some best practices for mitigating third-party vendor cybersecurity risk.
- Developing effective strategies for managing data relationships with third-party vendors.
- How finance departments need to change their thinking about third-party vendor risk.
To comment on this podcast or to suggest an idea for another podcast, contact Drew Adamek, an FM magazine senior editor, at Andrew.Adamek@aicpa-cima.com.
Transcript:
Drew Adamek: Matt, thank you so much for joining us.
Matt Kelly: Happy to be here, Drew.
Adamek: How badly are companies exposed to third-party vendor cybersecurity risk?
Kelly: Badly, and the only way we might qualify that is to say there are various gradations of badly. You could be very badly exposed or slightly badly exposed. All are exposed, but very few companies, I think, are managing this risk well. I have heard of some, but by and large, the numbers on how likely you are to experience a loss of confidential data through some vendor of yours are alarmingly high.
Adamek: When we talk about cybersecurity risk, what exactly are we talking about? Are we talking data breaches, financial theft, is it the entire gamut? How would you describe the kinds of risk companies are facing?
Kelly: There are the more traditional cybersecurity risks where there is an outside party trying to force its way into your enterprise, and more specifically your extended enterprise with all of your vendors, all of your third parties, all of your contract employees working on premises next to your full-time employees, and all of this. You know, they're trying to get that data somehow. It may be in very many disparate places, hither and yon, and there may be some hackers who are trying to force their way through the firewall. That's the more traditional hacking that we've thought of.
But, you know, really I think probably the bigger risk is that companies don't understand their own cybersecurity practices. Either they are very loose about those things, so they set that firewall, for lack of a better word, very low and it's easy to jump over, or, more likely, they have not kept their head in the game about how they might be duped into giving the data up voluntarily through some insider threat.
And I don't even like the term threat because that implies the insider is in on it. Sometimes they are, but I think the vast majority of times it's more like, oops, we accidentally gave away data we shouldn't have; your own employees giving away your own data or a third party giving away your data by mistake or you giving away somebody else's data that you are momentarily in charge of. That happens, too. But those are the more fluid sort of cybersecurity risks that I think are the bigger struggle now.
Adamek: And this idea of a vendor. We traditionally think of someone you buy a product from or who sends you something tangible, but that definition needs to be expanded a bit to include services that you use that you might not consider a vendor; for example, Skype or even LinkedIn.
Kelly: Yep.
Adamek: How should companies be defining vendor and how wide of a net should they be casting when they think about their vendor chain?
Kelly: I mean if you want to be on the safe side, you cast it as widely as possible, and I know some companies where the compliance or audit executive has said, "I'm going to define a vendor as anybody we have cut a cheque to in the last 12 months." Sounds like a good idea, except I know somebody who did do this at a large pharmaceutical company and, of course, the accounts payable people returned a list of, like, 8 trillion vendors, and a lot of them are not necessarily high-risk. You know, the vendor who comes in and sets up the vending machine is in theory a third party, but nobody is going to hack the vending machine. Really we need to think more about who is providing a service to us by which they wind up touching our confidential data, and stop there. Anybody who gets paid to do something and, as part of what they're doing, touches the confidential data, that's vendor risk, period.
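To make that scoping rule concrete, here is a minimal sketch in Python, assuming a hypothetical accounts-payable export with made-up column names; a real inventory would be drawn from the ERP system and data-access records rather than a flat file.

```python
# A minimal sketch of the scoping rule described above: start from an
# accounts-payable export of everybody paid in the last 12 months and keep
# only the vendors flagged as touching confidential data. The file name and
# column names ("vendor_name", "touches_confidential_data") are hypothetical
# placeholders, not any real system's schema.
import csv

def in_scope_vendors(path="accounts_payable_last_12_months.csv"):
    """Return the vendors that warrant cybersecurity due diligence."""
    with open(path, newline="") as f:
        return sorted(
            row["vendor_name"]
            for row in csv.DictReader(f)
            if row["touches_confidential_data"].strip().lower() == "yes"
        )

if __name__ == "__main__":
    for vendor in in_scope_vendors():
        print(vendor)
```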
Adamek: But you're describing a really — an enormous world.
Kelly: Yes.
Adamek: Do companies have an understanding, as you see it, of who does have access to their data?
Kelly: No, they don't. So here are some of those statistics I mentioned before. These come from the Ponemon Institute, which did a study last September about the data risks companies might have through their vendors. It was a survey of, I think, 600 or so corporate executives, and among that population the average number of third parties with access to sensitive information was 583, up about 25% from the prior year. And the really unsettling news, for those of you who might be thinking, "Well, 583 sounds kind of low": you're right. For a large organisation it is not 583; it is thousands.
I don't even know how many thousands, but for large financial services firms, for example, easily thousands of vendors are touching their confidential data. Same study, different statistic: 57% cannot determine whether the vendor's safeguards will actually work as promised. And the worst statistic of all was that only 34% of this group even had a full inventory of all the vendors with whom they share confidential data. We're not talking about what their safeguards are, or whether we have tested them, or anything else like that. Only 34% even know who their vendors are, which is something we should think about.
Adamek: And I want to talk about the kind of data that's at risk, because my audience works in the finance department. They're dealing with financial information, but that's not the only thing at risk, is it? And how do you see the hierarchy of data at risk?
Kelly: Well, there are several different ways to look at it. You could certainly look at it through the hierarchy of what the regulators think is important, and probably number one would be private health care information, which has been regulated for more than 20 years now and clearly is a very sensitive, personal topic. So if you are losing medical testing results or other medical research about individuals, yeah, you're going to be in deep trouble. It's easy to say financial data, except I don't necessarily know that there's a whole lot of value to thieves in knowing that there is this account with this much money in it, or that your inventory is worth $4 million.
Well, you can find that from an SEC filing. That's not a big deal. But any information that could be resold will only have resale value if there's actual value to it, if you can do something with it. So personal data is valuable because it can be used to impersonate others and then do something else, launch a credit card fraud or something like that. You know, you could even cobble together different types of data from different individual persons into a composite synthetic person, which is a real term in cybersecurity.
You can have a synthetic person whose data is very difficult to track down, because any specific item about them is traceable to a real person somewhere else. But a lot of it also is user IDs and passwords, because they are literally the keys that unlock something else. In theory, if anybody listening to the podcast is in, say, government contracting, you could have intellectual property, drug development formulas, weapons design systems, anything that might be for sale on the black market; that's going to be highly prized. I think most companies know the abstract concept that these are valuable types of data, but we struggle with how we are actually manipulating them. What are all the ways that we're touching the data? That's where the risk is. The risk isn't in the data. The risk is in how you are touching the data.
Adamek: So, for example, business email compromise.
Kelly: Yeah.
Adamek: The flowchart, the company org chart is very valuable to criminals in a case like that.
Kelly: It certainly would be.
Adamek: Yeah.
Kelly: You know, you could get the org chart and then start piecing together, from the names on the org chart, their LinkedIn profiles. You composite together a plausible phishing attack so that it really does seem like the HR executive is emailing the HR admin to say, "Please submit all the employees' W-2 information." Different pieces of information put together can become more than the sum of the parts, and that's where it gets really difficult, because basically you wind up saying any piece of information might be valuable in connection with any other, so we have to protect it all, which is a tall order, but here we are.
Adamek: But you're talking about an operation of massive complexity for a multinational corporation. There could be, as you described, thousands, even tens of thousands, of third-party vendors. What's the mental shift that companies need to make, that finance departments need to make, when dealing with third-party vendors?
Kelly: I think partly it's beyond the finance department; it's also the executives and employees in the first line of defence, in the operating units. They need to make the mental shift that just because there is a compelling business case for using a cloud-based vendor for business processing, data storage, whatever (and oftentimes there is, and a lot of these cloud vendors are really good at what they do), you still need to understand the consequences of shifting your business process onto the cloud and doing that digital transformation. And as soon as you start to think about that, really this is also more of a corporate governance issue.
The board and the C-suite should be thinking all the way through the fact that eventually all of our business processes are going to be done on the cloud, and we'll be using some other vendor to do them because they are more expert at it than the average organisation's business function. But you need to start thinking through how to conduct those business processes, once they have been digitally transformed, in a secure manner, given all of the regulatory risks we have, the litigation risks we have, and what our resources are. You know, we could certainly talk about protecting payments in the finance department, and ultimately all of this is about stealing money.
Adamek: Once a company has made the decision to strengthen their third-party vendor cybersecurity risk, what's the approach in that relationship? How do you approach those third-party vendors and ask them to be more secure?
Kelly: Well, and not to be flip about it, but approach it clearly and specifically: we are worried about this, and you will have to address it if you want to work with us. For many large vendors out there, you're not going to be the first person to ask them; you're not even going to be the millionth. So they're prepared for that. But when you think about this as an exercise in risk management for your own company, you will need the capability to understand what your risks actually are so that you can go to the vendor and tell them what you want them to do. Specifically, you could get what is known as a SOC 2 audit of a vendor's cybersecurity controls, and the AICPA has five trust services principles that you would consider as part of scoping your SOC 2 audit, but not all five have to be in every SOC audit, because not every vendor you have is going to deal with all of them.
If you don't collect personally identifiable information or personal health information, you don't have any privacy risks, so there's no need to include privacy in the scope of a SOC 2 audit for, say, a data storage vendor. But let's say you don't collect that today, and then life goes on, and next year somebody in your marketing department decides to launch a new product with a new campaign where you start collecting personally identifiable information.
Do you in the finance function, or the audit function, or the compliance function, know that marketing did that? Did they tell you? And if they did, did you understand that you then have to retailor your SOC audit and put privacy back in scope, because now things have changed?
It gets difficult because life online is so fast and so fluid and so easy that the marketing department could do this with the best of intentions, and it might be a great marketing idea, and then four months later they realise, oh, we've been collecting personal information about minors, we forgot to tell compliance, and we don't have a mechanism to get consent. You're up to your eyeballs in regulatory and reputational risk now, and it doesn't matter that it was a great idea then; it's a terrible idea in hindsight. So how do you build those good data hygiene and good vendor risk management practices into what your organisation does? And I'm under no illusions about it; if you're a large organisation, that can be hard.
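As a rough illustration of that scoping decision, here is a minimal sketch assuming hypothetical vendor attributes; which of the five trust services categories actually belong in a given SOC 2 scope is a judgment for the company and its auditor, not a formula.

```python
# A minimal sketch of the scoping logic described above: decide which of the
# five AICPA trust services categories belong in a vendor's SOC 2 audit based
# on what that vendor does with your data. The attribute flags below
# ("stores_pii", "runs_transactions", etc.) are illustrative assumptions,
# not an AICPA-defined schema.
def soc2_scope(stores_pii=False, handles_confidential_data=False,
               runs_transactions=False, is_business_critical=False):
    scope = {"security"}                      # security is always assessed
    if is_business_critical:
        scope.add("availability")
    if runs_transactions:
        scope.add("processing integrity")
    if handles_confidential_data:
        scope.add("confidentiality")
    if stores_pii:
        scope.add("privacy")                  # only if PII/PHI is collected
    return scope

# A data storage vendor with no personal information today:
print(soc2_scope(handles_confidential_data=True, is_business_critical=True))
# Next year, marketing starts collecting personal information through that
# vendor, so the scope has to be retailored to pull privacy back in:
print(soc2_scope(handles_confidential_data=True, is_business_critical=True,
                 stores_pii=True))
```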
Adamek: When you ask your vendors to do an audit, what are the standards you should be looking at?
Kelly: You definitely want to use a SOC 2 type 2 audit. A type 1 audit will assess the design of the internal control: is it designed properly for the objective at hand at the point in time we're looking at it? OK, that's fine; that's a nice preliminary step. A type 2 audit takes it further: does this control actually operate as designed over a prolonged period of time and fulfil the objective?
Unfortunately, it is not news that sometimes vendors say they have controls and they don't or they think they have controls but they're misconfigured or they think it's all great and the control doesn't actually work. So a SOC 2 type 2 audit done by a reputable and competent audit firm that works on these sorts of things, that's what you want.
The SOC 2 audits are specific to you and the vendor you're looking at. So there might be vendors out there who say, "Oh yeah, we're SOC 2 compliant." Well, no, you're not, because a SOC audit only exists for the specific relationship that is being evaluated. There is also a thing called a SOC 3 report; it's a somewhat serious effort, and it shows the vendor is thinking about these issues, but it's not much more than a Good Housekeeping Seal of Approval you can put on your website so prospective customers know you've thought about it. I've thought about being an astronaut; I am not actually going to go ahead and be one. So you, the company, will still need to push them to get a SOC 2 audit, ideally type 2.
Adamek: You've asked a third-party vendor for a SOC 2 audit.
Kelly: Yeah.
Adamek: Who pays for it?
Kelly: Whoever you get to pay for it. Generally, I think you can get them to pay for it in one way or another if you want it badly enough, or basically tell them, "If you want my business badly enough, you're going to have to figure out how to get this done." Maybe they pay for it with a separate cheque, maybe they pay for it by charging you a lower fee, but you get it done with a certified, or somehow authorised, or reputable SOC auditor, and they're out there. How you pay for it can be negotiated, like all things monetary, but the bigger issue for you, the company, is that you have to know what you want the audit firm doing the SOC audit to do.
Adamek: This is another circle-back kind of question, but I thought it was keenly important to the way that we're going to do business, all of us, in the next 10, 15, 20 years. Is easier business safer business?
Kelly: No.
That's the simple answer. The difficulty with the rise of cloud-based services is that really any person with a Google search, a payment card, and a dream is going to be able to find a vendor to help him or her do whatever it is. So it's easier to onboard a vendor, but I did not say it's easier to onboard them well. It's easier to onboard them poorly. It is easier to evade corporate internal controls. So you, the company, have to think not so much about hard controls blocking people from searching Google as about softer, cultural, control-environment measures: getting employees to understand that it's a bad idea to hire Joe's Fly-By-Night Data Processing.com for $2 a month.
What we're really getting at here with the rise of cloud computing is that employees have more freedom to act, but it is more freedom to act recklessly. They can also have more freedom to act smartly, but you're going to have to coach them on the difference between smartly and recklessly.
Adamek: And that gets to my final question. Cybersecurity risk, whether it comes internally or through your third-party vendors, is it a technology issue or is it a person issue?
Kelly: It's absolutely a person issue. Solving it will require your IT security team to help you. The auditor, the finance director, the compliance executive, whoever, ideally all of them working as a team, will sit down and say, "Here are the risks that we are creating by doing business in this new manner. Here's what we will need to do for better security protocols. IT security, now please help us figure out how to make that work on your end."
But like I said, this is a people problem because if you come up with really tough IT controls all the employees are going to do is view that as an obstacle to doing their real job and then they will spend all of their time trying to figure out how do I evade the control the company has put in front of me so I can get on with my real job?
So number one, they probably will find that evasive way to do it, and number two, you would ideally like them to spend all their time thinking about how to do their job, not how to avoid the internal control you put in place. So there are plenty of ways you could be draconian and jerk-like in your IT governance policies, but you're going to create more enemies than friends that way, and you are going to need employees and third-party vendors to be your friends and all work together to figure out good data security, because, man, these hackers are really smart and devious in how they're going to come at you.
Adamek: Matt, thank you so much for joining us.
Kelly: Pleasure to be here, Drew.