By: Paul German, CEO at Certes Networks
Attention all cybersecurity professionals!
We all know that today’s cybersecurity landscape is an ever-changing one. So how often should organizations review their cybersecurity strategy? If that question hasn’t been asked in a while, then in this world of constant threat you’re probably at risk.
For despite the near-constant stream of data breaches making headlines, far too many organizations insist their current cybersecurity model is good enough. In reality, the opposite holds true. Quite simply, if any of the statements below apply to your business, then that cybersecurity confidence is arguably misplaced complacency.
We haven’t been hacked before, and I know where my organization’s critical or sensitive data is at all times. Why change something that’s working today?
No business can ever be 100% sure where its data is or that it hasn’t been compromised in transit. Recognizing this is a board-level responsibility.
We tick the boxes when it comes to GDPR, PCI DSS, HIPAA (and other regulations), so my organization is secure. No company that has met its compliance requirements has ever been hacked, right?
Taking a compliance-led approach to securing customer data creates a fundamental vulnerability within the cybersecurity infrastructure, one that is simply waiting for hackers to exploit. Compliance is clearly important, but it should be a subset of an overall, continuously evolving security strategy rather than an end goal in itself. Organizations are understandably concerned about the financial penalties associated with failing to achieve regulatory compliance. But take a step back and consider the financial implications of a data breach and a high-profile compromise of customer data: a far more significant cost, and an event with long-term repercussions for customer perception and loyalty.
We trust that our WAN provider has the necessary controls in place to keep our data secure as it moves between locations.
WAN providers can’t guarantee the security of their environments, and the security of your data is ultimately your responsibility. What’s needed is a data-first ‘Zero Trust’ mindset that protects data before sending it to the carrier network.
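As a minimal sketch of that data-first approach, the snippet below (Python, using AES-GCM from the widely available ‘cryptography’ package; the function names and key handling are purely illustrative, not a reference to any particular product) encrypts and authenticates the payload before it ever reaches the carrier, so the WAN only transports ciphertext and any tampering in transit is detected on receipt.

# Illustrative only: encrypt data before it is handed to the carrier network,
# so the WAN never sees plaintext. Key handling is simplified for brevity.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # in practice, issued by the organization's own key-management system
aead = AESGCM(key)

def protect_before_transit(payload: bytes) -> bytes:
    """Encrypt and authenticate the payload so only ciphertext crosses the WAN."""
    nonce = os.urandom(12)                  # unique nonce per message
    return nonce + aead.encrypt(nonce, payload, None)

def recover_after_transit(blob: bytes) -> bytes:
    """Decrypt at the receiving site; modification in transit raises an exception."""
    nonce, ciphertext = blob[:12], blob[12:]
    return aead.decrypt(nonce, ciphertext, None)

ciphertext = protect_before_transit(b"customer record 4711")
assert recover_after_transit(ciphertext) == b"customer record 4711"

The point of the sketch is simply that the keys, and therefore the trust, stay with the organization rather than with the carrier.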
IT costs need to be reduced, so the easiest thing is to cut the security budget; it reduces cost without reducing functionality. But, just in case, we’ve increased our cyber insurance coverage.
Cybersecurity insurance policies require customer diligence. You cannot buy an insurance policy, fail to deploy security and then expect a post-hack payout. More significantly, think about the cost and loss of earnings associated with the fallout of a data breach…
My network is secure so I don’t need to secure our data in motion. After all, we own the entire infrastructure end to end, wherever our data goes.
When 70% of all breaches are the result of internal user compromise, this is a false sense of security. Not only are current security models broken, current trust models are too, so both must be realigned and rebuilt. The only way to do that is to change the emphasis. Shift the security focus from the infrastructure to the user or application and it doesn’t matter how complex the technology has become, or becomes in the future; the security model stays simple and therefore both manageable and relevant. Moreover, whether the environment is owned by the business, run by a third party or hosted in the cloud, when access is based on users and applications, only a user with the right cryptographic keys and credentials gains access. It is that simple.
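To make that shift in emphasis concrete, here is a minimal sketch assuming Ed25519 key pairs from Python’s ‘cryptography’ package; the identity names and the authorize_request function are hypothetical, not a product API. Access is decided solely by whether the requesting user or application can prove possession of an enrolled key, regardless of which environment the request arrives from.

# Illustrative user/application-centric access control: the decision depends on
# keys and credentials, not on which network or infrastructure the request uses.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Each user or application enrolls a public key with the access service.
alice_key = Ed25519PrivateKey.generate()
enrolled_keys = {"alice@finance-app": alice_key.public_key()}

def authorize_request(identity: str, request: bytes, signature: bytes) -> bool:
    """Grant access only if the request is signed by the key enrolled for this identity."""
    public_key = enrolled_keys.get(identity)
    if public_key is None:
        return False
    try:
        public_key.verify(signature, request)
        return True
    except InvalidSignature:
        return False

request = b"GET /customer-records"
signature = alice_key.sign(request)
assert authorize_request("alice@finance-app", request, signature)        # enrolled key: access granted
assert not authorize_request("mallory@unknown-app", request, signature)  # no enrolled key: denied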
We need not worry: we can do encryption on our firewalls, switches and routers for less money and achieve the same result.
Turning on encryption in a network device will degrade performance, typically by 50%. The reason lies in the way encryption has been deployed to date. To address the continued friction between operational goals and security imperatives, organizations need to decouple encryption from the infrastructure completely and instead overlay the security measures, leaving the underlying infrastructure untouched. The answer is Layer 4 encryption.
Layer 4 encryption is dedicated to protecting data in motion and the applications moving across the infrastructure, yet it avoids any impact on network performance and adds no complexity. Furthermore, Layer 4 encryption operates in ‘stealth’ mode: only the data payload is encrypted, not the entire network packet. All of the complex management and maintenance problems created by traditional encryption deployments are removed. Data in motion is secure, without added complexity and without compromising the operational performance of the infrastructure.
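The sketch below illustrates the principle rather than any particular implementation: only the application payload is encrypted, while the addressing information that routers, firewalls and monitoring tools rely on stays readable. It assumes AES-GCM from Python’s ‘cryptography’ package and a deliberately simplified Segment structure.

# Illustrative payload-only encryption: headers stay in the clear so the network
# still routes, load-balances and monitors as before; only the data is hidden.
import os
from dataclasses import dataclass, replace
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

@dataclass
class Segment:
    src: str          # source address - left readable
    dst: str          # destination address - left readable
    sport: int        # source port - left readable
    dport: int        # destination port - left readable
    payload: bytes    # application data - the only part that is encrypted

key = AESGCM.generate_key(bit_length=256)
aead = AESGCM(key)

def encrypt_payload_only(seg: Segment) -> Segment:
    """Encrypt just the payload; the headers are bound as associated data but not hidden."""
    nonce = os.urandom(12)
    header = f"{seg.src}:{seg.sport}->{seg.dst}:{seg.dport}".encode()
    return replace(seg, payload=nonce + aead.encrypt(nonce, seg.payload, header))

clear = Segment("10.0.0.1", "10.0.0.2", 40000, 443, b"cardholder data")
protected = encrypt_payload_only(clear)
print(protected.src, protected.dst, protected.dport)   # headers unchanged, still routable
print(protected.payload != clear.payload)              # payload is now ciphertext

Because the headers are untouched, routing and quality of service behave exactly as before, and binding them as associated data means any tampering with them is still detected when the payload is decrypted.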
We don’t need encryption because our firewalls will keep the hackers out, or if not, our intrusion detection will alert us immediately so we can stop a breach while it’s happening.
The current security mindset must move away from outdated thinking about securing the perimeter and the assumption that breaches can be ‘protected’ against, ‘detected’ and ‘reacted’ to. With the average time to detection running at 120 to 150 days, depending on the source, this is clearly a fallacy. When it comes to data breaches, it is a case of ‘when’, not ‘if’, so organizations must think about how best to ‘contain’ a hacker and stop them wreaking havoc on their data, by adopting a software-defined approach to security and leaving the infrastructure-based security mindset behind.
Data compromise is something that happens to other businesses, not ours!
That’s what all the brands that have been in the headlines over the past 18 months thought as well.
The game has changed; it’s no longer about the high-profile, kudos-winning breaches. Today’s hacking community is far more focused on the theft of sensitive, critical customer data that will leave those affected with long-term repercussions. Cybersecurity must be a process of continual evolution: just because you feel protected today doesn’t mean you will be tomorrow.