The US National Institute of Standards and Technology’s (NIST) recent Special Publication 800-207 (SP 800-207) has changed the table stakes for cybersecurity best practice. While not mandatory, the federal agency’s role in enhancing economic security cannot be underestimated. As such, its guidance refining the concept of Zero Trust, and its high-level roadmap for how organisations can implement a standardised approach to a Zero Trust Architecture, cannot be ignored either.
Paul German, CEO, Certes Networks, outlines why adopting a Zero Trust mindset to data and cyber security is now an operational necessity, and explores the essential components organisations must embrace to achieve this.
Zero Trust
The concept of ‘zero trust’ is not new; originally defined in Stephen Paul Marsh’s doctoral thesis on computational security in 1994, it became a key cybersecurity concept when Forrester’s John Kindervag reignited it in the late 2000s. The idea is that attacks can come from within an organisation’s network as well as from outside it.
However, until recently, the debate around zero trust has remained – in my view – focused solely on authenticating the user within the system, rather than taking a more holistic approach that combines user authentication with access to sensitive data held in protected micro-segments. NIST’s Special Publication changes this: the focus of zero trust is no longer the network, but finally the data that traverses it.
At its core, NIST’s Special Publication decouples data security from the network. Its key tenets of policy definition and dynamic policy enforcement, micro-segmentation and observability offer a new standard of Zero Trust Architecture (ZTA) for which today’s enterprise is responsible.
Dynamic Policy aligned to Business Intent
As data owners, organisations are responsible for protecting their sensitive information. Moreover, with increasing regulation that specifically targets the protection of this sensitive data, it is more important than ever that organisations adopt a cybersecurity stance that can ensure – and maintain – compliance, or information assurance. However, not all data has the same level of sensitivity.
Under the latest zero trust standards, data needs to be classified according to differing levels of sensitivity and the business intent of that data. This business intent needs to define an organisation’s operational policy around how data is handled and accessed, when, where and by whom, with micro-segmentation protecting each data class from external compromise and providing isolation from other data classifications.
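This mapping of data classes to business-intent rules and micro-segments can be sketched in code. The class names, segment names, and roles below are invented for illustration; they are not drawn from SP 800-207 itself.

```python
from dataclasses import dataclass

# Hypothetical data classifications; each class carries its sensitivity,
# the isolated micro-segment it lives in, and the roles business intent
# permits to access it.
@dataclass(frozen=True)
class DataClass:
    name: str                  # e.g. "PCI", "PII", "Public"
    sensitivity: int           # higher = more sensitive
    micro_segment: str         # isolated segment this class lives in
    allowed_roles: frozenset   # who may access it, per business intent

POLICY = {
    "PCI":    DataClass("PCI", 3, "seg-payments", frozenset({"payments-ops"})),
    "PII":    DataClass("PII", 2, "seg-hr", frozenset({"hr", "payments-ops"})),
    "Public": DataClass("Public", 1, "seg-dmz", frozenset({"any"})),
}

def may_access(role: str, data_class: str) -> bool:
    """Business-intent check: a role may touch a class only if policy says so."""
    dc = POLICY[data_class]
    return "any" in dc.allowed_roles or role in dc.allowed_roles
```

Because each class sits in its own micro-segment, a compromise of one segment does not expose data of a different classification.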
In addition, enterprises are encouraged to observe and collect as much information as possible about their asset security posture, network traffic and access requests; process that data; and use any insight gained to dynamically improve policy creation and enforcement.
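The feedback loop this describes, observed telemetry dynamically adjusting enforcement, might look roughly like the following. The threshold and the risk scale are assumptions for the sake of the sketch.

```python
from collections import Counter

# Illustrative telemetry feedback: denied access requests per source asset
# feed a dynamic risk score that is consulted at policy-decision time.
denials = Counter()

def record_decision(asset: str, allowed: bool) -> None:
    """Collect the outcome of each access request as telemetry."""
    if not allowed:
        denials[asset] += 1

def risk_score(asset: str) -> int:
    # Assumption: three or more denials mark the asset as high risk,
    # triggering stricter enforcement (e.g. step-up authentication).
    return 2 if denials[asset] >= 3 else 0
```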
Authentication and Authorisation
Under NIST’s zero trust standards, access to individual enterprise resources is granted on a per-session basis, determined by a combination of component relationships – the observable state of the client identity, the application or service, and the requesting asset – possibly including other behavioural and environmental attributes, together with operational policy enforcement.
Authentication and authorisation to one resource does not grant access to another resource. It is also dynamic, requiring a constant cycle of obtaining access, scanning and assessing threats, adapting, and continually re-evaluating trust in ongoing communication.
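A minimal sketch of that per-session, continually re-evaluated decision might look as follows; the attribute names and the five-minute staleness window are illustrative assumptions, not values from the standard.

```python
import time

# Hypothetical per-session trust evaluation: every request re-derives a
# decision from current attributes; nothing is granted permanently, and
# access to one resource never implies access to another.
def evaluate_session(identity_verified: bool,
                     device_compliant: bool,
                     resource: str,
                     granted_resources: set,
                     last_check: float,
                     max_age: float = 300.0) -> bool:
    # Continual re-evaluation: a stale assessment forces re-authentication.
    if time.time() - last_check > max_age:
        return False
    # Per-resource grant: authorisation to one resource does not carry over.
    if resource not in granted_resources:
        return False
    return identity_verified and device_compliant
```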
Cybersecurity best practice demands fine-grained policies that apply authentication and authorisation on a ‘per-packet’ basis, allowing access only to the resources required. Layer-4 encryption protects data as it transits between policy enforcement points while preserving full observability: only the payload is encrypted, leaving the packet header in the clear, which also allows granular enforcement of security policies.
Network visibility and observability tools are the linchpin: they provide real-time contextual metadata that enables rapid detection of out-of-policy data, and fast response and remediation of any non-compliant traffic flow or policy change, maintaining the required security posture on a continuous basis.
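A compliance check of this kind, matching observed flow metadata against policy, can be sketched as below. The segment names and allowed-flow table are invented for illustration.

```python
# Illustrative policy table: which micro-segment-to-micro-segment flows
# are permitted. Anything else observed on the wire is non-compliant.
ALLOWED_FLOWS = {("seg-hr", "seg-payments"), ("seg-dmz", "seg-hr")}

def check_flow(src_segment: str, dst_segment: str,
               payload_encrypted: bool) -> str:
    """Classify an observed flow from its real-time metadata."""
    if not payload_encrypted:
        return "ALERT: cleartext payload"
    if (src_segment, dst_segment) not in ALLOWED_FLOWS:
        return "ALERT: out-of-policy flow"
    return "OK"
```

Alerts like these would feed the remediation loop, restoring the required posture as soon as a non-compliant flow or policy change appears.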
No Compromise
Fundamentally, a Zero Trust posture must be achievable without compromising the performance of the network, giving authenticated and authorised users seamless access to the data they need to do their jobs.
Organisations need to be able to secure data in transit, across any network, with no impact on performance, scalability or operational visibility. As the latest NIST zero trust standards advocate, decoupling security from network hardware in this way enables security teams to be confident that their organisation’s data is assured regardless of what is happening to the network – finally putting the focus for cybersecurity best practice where it belongs: on the data.