Expert Opinion: Why data resilience needs a reality check

Dave Russell, Senior Vice President, Head of Strategy at Veeam, discusses how global organisations seem to have let their guard down when it comes to cybersecurity and why data resilience standards are not keeping up with today’s technologies and applications.

For years, many organisations have been guilty of putting data resilience on the back burner. Over time, however, the rising tide of threat levels, regulations and best practices has lifted all boats. Resilience is now firmly on the radar.

Time for a rethink

Awareness is only half the battle; preparedness is another matter. Now that industry benchmarks have improved and organisations have a better idea of what to look for, they are waking up to an uncomfortable fact: they aren’t as prepared as they ought to be. The Veeam report on data resilience among larger enterprises, produced in collaboration with McKinsey, found that key aspects of cyber resilience – even old-hat fundamentals like ‘People and Processes’ – were regularly self-reported as significantly lacking.

How did we get here? And how can organisations shore up these shortcomings? For C-suite decision-makers, resilience perhaps isn’t the most exciting or business-critical concern. Historically, it was often lumped in with general cybersecurity and assumed that it was already in place. Unfortunately, like most contingencies, the true value of data resilience can’t be appreciated until things go wrong.

With law enforcement cracking down on some of the most prominent groups, including the likes of BlackCat and LockBit, there might have been an assumption that cyberattacks as a whole are trending down. But the reality couldn’t be further from the truth. In the last year alone, 69% of companies faced an attack at one point or another, yet 74% still fell short of data resilience best practices. The threat is only evolving, with smaller groups and so-called ‘lone wolf’ attackers stepping into the gap. And with a new subsection of attackers comes a new set of methods, with faster data exfiltration techniques on the rise.

The writing’s on the wall

The same Veeam report, produced in collaboration with McKinsey, revealed that 74% of participating enterprises lacked the maturity needed to recover quickly and confidently from a disruption. While cyber resilience gaps are often a case of ‘not realising before it was too late’, in this case, many of these deficiencies were self-reported. But if organisations are aware, why haven’t they plugged these gaps?

For some, it could be down to the simple fact that they’ve only just realised. The recent wave of EU-focused regulations, including notably NIS2 and DORA, has spotlighted the issue by requiring organisations to up their resilience across the board. In the build-up to their compliance deadlines over the last year, organisations had to critically assess their full data resilience, many for the first time, revealing a number of previously unknown blind spots.

But no matter how they realised their gaps, organisations did not fall behind overnight. For many, it’s happened incrementally with their data resilience standards not keeping up as new technologies and applications have been adopted. With most organisations implementing AI at will to stay ahead of the competition and optimise business processes, the impact on their data profiles has gone largely unnoticed. The sheer amount of data needed and generated by these applications has resulted in sprawling data profiles that fall far outside existing data resilience measures.

Taking a step in the right direction

The first step for any organisation with below-par data resilience should be to build a clear picture of your data profile: what you have, where it’s stored and why you do or don’t need it. With this, you can reduce at least some of your data sprawl by filtering out any obsolete, redundant or trivial data to focus on securing the data you need. Then, get to work securing it.
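As a rough illustration of that filtering step, the sketch below classifies an inventory of data assets into ‘keep’ and review candidates. All record fields, names and thresholds here are invented for illustration – they are not a Veeam schema or recommendation – and a real inventory would come from discovery tooling rather than a hard-coded list.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical inventory record; field names are illustrative assumptions.
@dataclass
class DataAsset:
    name: str
    location: str
    last_accessed: date
    size_mb: int

def classify_data(assets, today, stale_after_days=730, trivial_under_mb=1):
    """Split an inventory into data to keep and obsolete/redundant/trivial
    candidates to review before securing what remains."""
    keep, review = [], []
    seen_names = set()
    for asset in assets:
        obsolete = (today - asset.last_accessed).days > stale_after_days
        trivial = asset.size_mb < trivial_under_mb
        redundant = asset.name in seen_names  # same logical data seen already
        seen_names.add(asset.name)
        (review if (obsolete or trivial or redundant) else keep).append(asset)
    return keep, review

inventory = [
    DataAsset("crm_export", "s3://prod", date(2025, 1, 10), 500),
    DataAsset("crm_export", "s3://backup", date(2025, 1, 10), 500),  # duplicate
    DataAsset("old_logs", "nas://archive", date(2019, 3, 1), 2000),  # stale
]
keep, review = classify_data(inventory, today=date(2025, 6, 1))
```

The point is not the thresholds themselves but that the classification is explicit and repeatable, so the scope of what must be secured is known rather than assumed.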

But the work doesn’t stop there. Once you’ve got your shiny, new data resilience measures in place, it’s time to stress test them. And not just once. Data resilience measures need to be consistently and comprehensively tested to push them to their very limits, just as a real attack would – cyber-attackers won’t stop when your systems start to creak a little, and they won’t wait until the perfect time.

Go through scenarios where key stakeholders are on annual leave, or where security teams are occupied with something else entirely, to expose all of the potential gaps in your measures. It might seem excessive, but otherwise, the first you’ll hear about these vulnerabilities will be during or following a real attack.
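One lightweight way to rehearse such scenarios is to check, before each drill, whether the recovery plan still works with key people removed. The sketch below is a minimal example of that idea; every role, name and scenario is invented for illustration.

```python
def drill_coverage_gaps(on_call, required_roles, unavailable):
    """Return the recovery roles left uncovered when some people are
    on leave or otherwise occupied during a drill."""
    available = {role for role, person in on_call.items()
                 if person not in unavailable}
    return sorted(set(required_roles) - available)

# Hypothetical roster and scenario: the security lead is on annual leave.
on_call = {"backup_admin": "asha", "network": "bo", "security": "cai"}
gaps = drill_coverage_gaps(
    on_call,
    required_roles=["backup_admin", "security"],
    unavailable={"cai"},
)
```

Running the same check across many such scenarios surfaces single points of failure in people and processes before a real incident does.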

It’s a significant piece of work to undertake, but data resilience is worth every penny. According to the same Veeam and McKinsey report, companies with advanced data resilience capabilities see 10% higher annual revenue growth than those lagging behind.

That’s not to say that improved data resilience will magically boost these figures for you, but bringing up your data resilience standards will inevitably have a knock-on effect on processes across the board. At the very least, you can be sure that cyberthreats will only grow more complex, and that data footprints won’t be getting smaller any time soon. It’s an issue that every organisation will have to face, so jump in the deep end now before you get pushed beyond your limits by a cyber-attack.

Intelligent CIO Middle East
