The age-old question: how much is enough? Executives at companies large and small ponder this question every day as news of the latest cyber-attack fills network broadcasts and social media feeds. So how does a company ultimately decide how much to spend on protecting itself from cyber-attacks? When it comes to cybersecurity there is no simple answer, and the ultimate decision depends on several key factors.
Different industries have varying levels of requirements for protecting critical data. For example, healthcare has HIPAA as a compliance standard, while one of the newer standards, NIST SP 800-171, regulates cybersecurity requirements for any company that does business directly or indirectly with the Department of Defense. In instances where regulatory standards are involved, there are requirements that must be met, so investment decisions typically focus on minimizing the resources necessary to meet the standard. This minimalist approach ensures compliance requirements are met but doesn’t necessarily reduce risk. In non-regulated industries, the amount spent will vary based on intangible factors like the importance of avoiding disruption (manufacturing), reputational value (finance), and data restoration cost.
Another important factor on the spend side of cybersecurity is the depth of expertise and available resources in an organization. These components will vary widely based on the size and industry affiliation of any given firm. Larger organizations with internal resources maintain employee fixed costs that are easy to define, while smaller firms without internal resources will assume variable costs as external consultants are contracted to assist. Understanding what resources are necessary to support an effective risk management program depends on an organization’s ability to quantify its current risk profile. A holistic risk assessment that uses an industry-standard framework like NIST or CIS provides the insight and detail necessary to develop a sustainable resource plan.
One of the most relevant factors when evaluating how much to spend on risk management is the risk tolerance of an organization. Unfortunately, this component can be the most challenging to quantify. How does an organization determine what is a reasonable risk tolerance? It really comes down to how much recovery cost and reputational damage the business can reasonably assume under a worst-case scenario. Cyber insurance can help mitigate the cost side of recovery, but it does nothing to address the reputational damage that results from a data breach. Evaluating ongoing investment levels against the average cost of a data breach, which was $4.24 million in 2021, adds important context to the decision-making process.
While there are no industry-standard investment requirements for cybersecurity, the revenue and industry focus of an organization will ultimately factor into the equation. As a rule, organizations should plan to spend between 1% and 1.5% of annual revenue on cybersecurity resources (people and technology). For industries that have regulatory requirements (HIPAA, PCI, etc.), spending will be higher due to the emphasis on recurring evaluation and enforcement necessary to meet the standards. Just remember: it’s not a question of if, but a question of when a company will fall victim to a cyber-attack. Organizations that plan and invest in prevention will ultimately minimize impact.
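To make the rule of thumb concrete, here is a minimal sketch that applies the 1%–1.5%-of-revenue guideline and compares the result against the $4.24 million average breach cost cited above. The revenue figure and the `cyber_budget_range` helper are hypothetical illustrations, not part of any standard:

```python
def cyber_budget_range(annual_revenue):
    """Rule-of-thumb cybersecurity budget: 1% to 1.5% of annual revenue."""
    return (annual_revenue * 0.01, annual_revenue * 0.015)

AVG_BREACH_COST_2021 = 4_240_000  # average cost of a data breach in 2021

# Hypothetical firm with $100M in annual revenue
low, high = cyber_budget_range(100_000_000)
print(f"Suggested annual spend: ${low:,.0f} to ${high:,.0f}")

# How many years of high-end spending equal one average breach?
years = AVG_BREACH_COST_2021 / high
print(f"One average breach costs about {years:.1f} years of high-end spend")
```

For this hypothetical firm, the guideline suggests $1M–$1.5M per year, while a single average breach would cost roughly three years of the high-end budget — context that helps frame prevention spending against worst-case recovery cost.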