Nick Ellsmore, Senior Vice President of Worldwide Consulting, Trustwave SpiderLabs
For years, we treated data as ‘the new oil’ – a coveted resource that powers the modern world. However, organizations are now realizing that data stockpiles are a lot less like oil than they are like uranium: hard to extract value from, risky and expensive to store, and a significant target for malicious actors.
With the global big data analytics market estimated to reach $68.09 billion by 2025, and data breaches up more than 400% in the post-pandemic landscape, it’s more important than ever to evaluate data storage requirements. Part of that process is understanding that the cost-benefit calculus of storing data has changed. Even if organizations want to maintain access to large data stockpiles, they shouldn’t necessarily be storing those stockpiles themselves.
While businesses may place great value on the data they collect about their customers, it’s worth asking whether they really need it.
Reducing the amount of sensitive data organizations keep
According to recent estimates, malicious entities can purchase databases with millions of stolen emails, compromised online banking logins, and verified Stripe and PayPal accounts ready to be exploited for as little as $100. The risk – and potential financial exposure – is significant for affected individuals and organizations alike.
In this landscape, the best security project almost any business can undertake is to reduce the amount of sensitive data it keeps. Being intentional about where the data is stored and how it is protected is similarly critical. The price organizations will pay for a breached driver’s license number is the same whether it’s stolen from a lost USB drive, a spreadsheet emailed out in error, or a vulnerable web application.
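To make ‘keeping less’ concrete, here is a minimal sketch of data minimization at the point of storage – retaining only the fragment of a driver’s license number that a hypothetical customer-service workflow actually needs. The field names and the retained fragment are assumptions for illustration, not a prescription:

```python
def minimise_license(record: dict) -> dict:
    """Keep only what the business process actually needs: the last four
    characters for matching, never the full license number."""
    license_number = record.pop("drivers_license")   # discard the raw value
    record["license_last4"] = license_number[-4:]    # retain a safe fragment
    return record

customer = {"name": "A. Citizen", "drivers_license": "D1234567"}
print(minimise_license(customer))
# {'name': 'A. Citizen', 'license_last4': '4567'}
```

If the full number is never stored, it can never be stolen from a lost USB drive or a misdirected spreadsheet.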
Conducting regular reviews of your organization’s data governance will help provide a clear understanding of your complete data footprint, including what you hold, whether you should be holding it, where it is held, and how it is protected. Using technology to verify the integrity of controls, and to verify the accuracy of information asset registers, is an important check and balance.
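As one hedged example of what that technology check might look like, the sketch below walks a file share and flags files containing patterns that resemble sensitive identifiers, so the results can be compared against the information asset register. The paths and patterns are illustrative assumptions; real data discovery tools use far more robust detection:

```python
import re
from pathlib import Path

# Illustrative patterns only; production tools add validation
# (e.g. Luhn checks for card numbers) and contextual analysis.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_tree(root: str) -> dict[str, list[str]]:
    """Report which files under root contain sensitive-looking data."""
    findings: dict[str, list[str]] = {}
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file – a real tool would log this
        hits = [name for name, rx in PATTERNS.items() if rx.search(text)]
        if hits:
            findings[str(path)] = hits
    return findings

# Anything flagged here that is absent from the asset register
# is unmanaged risk.
for file, kinds in scan_tree("./shared-drive").items():
    print(f"{file}: {', '.join(kinds)}")
```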
Making use of secure third-party data storage
Putting data stockpiles under lock and key isn’t enough. Concentrating data in one place involves too much risk – unless your core business actually is data storage.
This concept – of data storage being a core business – is likely to lead to the emergence of a small number of third-party service providers that are tasked with holding, managing, and protecting data using secure processes such as tokenization. This removes the need for businesses to guard sensitive data themselves. The rapid increase in cloud data storage and hosting availability reduces the need for organizations to develop large, complex, and often costly self-hosted information technology systems.
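For readers unfamiliar with tokenization, here is a minimal, in-memory sketch of the pattern – the hypothetical TokenVault below stands in for a third-party service and is not any specific provider’s API. The business stores only tokens; a breach of its systems yields strings that are useless without the vault:

```python
import secrets

class TokenVault:
    """Toy tokenization service. A real provider would use a hardened,
    access-controlled store and log every detokenization request."""

    def __init__(self) -> None:
        self._store: dict[str, str] = {}

    def tokenize(self, sensitive_value: str) -> str:
        # Issue a random, meaningless token; the mapping lives only here.
        token = "tok_" + secrets.token_urlsafe(16)
        self._store[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Reversal should be rare, authorized, and audited.
        return self._store[token]

vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")  # card number stays in the vault
print(token)                    # safe to store in your own systems
print(vault.detokenize(token))  # only the vault can reverse the mapping
```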
Still, your organization should be asking: what is the bare minimum of data we need to hold onto? Answering that question will not only help you avoid paying for more storage than you need, but also reduce your risk, because no data storage system is entirely bulletproof. Economies of scale allow third-party organizations to protect data effectively and to conduct rigorous audits, with assurance regimes providing checks and balances – but third-party risks still need to be assessed thoroughly.
To ensure your organization is protected from the loss of data stored on a third-party system, confirm that your cyber insurance policy covers third-party liability costs. The policy should contain language protecting the insured from harm caused by the acts, errors, or omissions of third-party subcontractors, vendors, and cloud providers.
The need for government and industry collaboration
For the future of data storage security, governments around the world need to work with industry to enforce a baseline for how businesses manage their data and keep it safe. Government and industry should collaborate on a PCI DSS-like security standard that can be applied to all sensitive data. From there, a government-endorsed marketplace of certified providers should be developed to hold that sensitive data, regulate intermediated access to it, and provide audit and assurance services.
Even the mismanagement of data destruction can lead to a costly breach that damages an organization’s reputation, which is why standardized electronic media destruction protocols are needed. At every turn, additional data security requirements arise.
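One destruction technique that such protocols commonly standardize is cryptographic erasure: encrypt data at rest, and destroy only the key when the data must go. The sketch below illustrates the idea using the cryptography library; the key-handling details are deliberately simplified assumptions:

```python
from cryptography.fernet import Fernet

# Encrypt the record at rest; the ciphertext is useless without the key.
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(b"customer record: DL# D1234567")

# Destroying the data means destroying the key. Once the key is gone
# from every key store, escrow, and backup, the ciphertext is unrecoverable.
key = None  # in practice: securely wipe the key from the HSM or key manager

# There is now no way to decrypt the ciphertext – the data is effectively
# gone, even if copies of the encrypted media survive.
```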
Today’s data explosion poses challenges, but it also offers opportunities to improve regulatory clarity and to hold a productive dialogue about fine-tuning security measures. As the volume of data continues to grow exponentially, we must understand it as a resource with a complex relationship to time: the value of data may degrade over time, but its risk may not degrade at the same rate.
Companies that proactively put solutions and processes in place to manage and govern their data stockpiles will benefit just as much from the data they keep as from the data they get rid of.
Nick Ellsmore is Senior Vice President of Worldwide Consulting and Professional Services at Trustwave, where he leads the global team that provides strategic advisory services to clients, alongside the elite SpiderLabs penetration testing and incident response teams.