“There are only two types of data that exist in your organization: data that someone wants to steal and everything else.”
- Forrester Research
When looking to improve your organization’s data security posture, it is good to start with what has changed. In the report, Forrester concluded that the old network security model was that of “an M&M, with a hard crunchy outside and a soft chewy center.” This is the idea of a hardened perimeter around the traditional, trusted datacenter. That model is fraught with vulnerabilities because it was never equipped to handle the attack vectors introduced by IoT, workforce mobility, and the migration of data centers to the cloud. It is increasingly outmoded and weak.
In its place must come a data security model that accounts for the current network landscape and its vulnerabilities. Enter Zero Trust. It builds on the notion of network segmentation and offers key updates, all under the banner: "never trust, always verify."
Below are the three main concepts to Zero Trust. Follow along as we break down the trusted/untrusted network model and in its place rebuild a new trust model.
The first rule of “never trust, always verify” is that all traffic within the network should be considered a potential threat until you have verified “that the traffic is authorized … and secured.” Let’s look at these two components:
The only way to minimize the risk of employees, contractors, or external bad actors misusing data is to limit each user or role to the fewest privileges possible. With this, it is a foregone conclusion that all sensitive data is already encrypted and that decryption privileges are granted minimally. We implement a minimal-privileges policy so that “by default we help eliminate the human temptation for people to access restricted resources,” and so that a hacker who compromises one user’s login credentials does not thereby gain access to the entire network.
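The minimal-privileges idea can be sketched as a default-deny check: a role holds only the permissions it was explicitly granted, and everything else (including any unknown role) is refused. The role and permission names below are hypothetical, chosen only for illustration.

```python
# Hypothetical default-deny, least-privilege check.
# Role names and permission strings are illustrative, not from the article.

ROLE_PERMISSIONS = {
    "analyst": {"read:reports"},
    "billing": {"read:invoices", "write:invoices"},
}

def is_authorized(role: str, permission: str) -> bool:
    """Deny by default: grant access only if the role explicitly holds the permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())

print(is_authorized("analyst", "read:reports"))    # True
print(is_authorized("analyst", "write:invoices"))  # False
print(is_authorized("intern", "read:reports"))     # False: unknown roles get nothing
```

The key design choice is that absence of a grant means denial; there is no fallback “trusted” bucket, which is exactly the posture “never trust, always verify” asks for.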
The role-based access control (RBAC) model, first formalized by David Ferraiolo and Richard Kuhn in 1992 and then updated under a more unified approach by Ravi Sandhu, David Ferraiolo, and Richard Kuhn in 2000, is the standard today. Its ability to restrict system access to authorized roles and users makes it the ideal candidate for implementing this leg of Zero Trust. While Zero Trust does not explicitly endorse RBAC, it is the best game in town as of today. For a deeper dive, visit NIST’s PDF of the model.
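A minimal RBAC sketch maps users to roles and roles to permissions, with senior roles inheriting from junior ones (role hierarchies are part of the unified NIST model). The users, roles, and hierarchy here are invented for the example.

```python
# Minimal RBAC sketch: users -> roles -> permissions, with role inheritance.
# All names below are hypothetical.

ROLE_HIERARCHY = {          # a senior role inherits its junior roles' permissions
    "admin": ["engineer"],
    "engineer": ["viewer"],
    "viewer": [],
}

ROLE_PERMS = {
    "admin": {"delete:db"},
    "engineer": {"write:code"},
    "viewer": {"read:docs"},
}

USER_ROLES = {"alice": {"engineer"}, "bob": {"viewer"}}

def effective_permissions(role: str) -> set:
    """A role's own grants plus everything inherited from junior roles."""
    perms = set(ROLE_PERMS.get(role, set()))
    for junior in ROLE_HIERARCHY.get(role, []):
        perms |= effective_permissions(junior)
    return perms

def user_can(user: str, permission: str) -> bool:
    """A user may act only through permissions held by one of their roles."""
    return any(permission in effective_permissions(r)
               for r in USER_ROLES.get(user, set()))
```

Because access flows only through roles, revoking a role or trimming its grants immediately tightens every user who held it, which is what makes RBAC practical for enforcing least privilege at scale.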
Once we have authenticated each user and restricted them to the minimum data needed to adequately do their job, the last thing to do is “verify that they are doing the right thing” through logging and inspection.
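In practice this means recording every access decision, allow and deny alike, so that anomalous behavior can be inspected later. A minimal sketch, assuming an in-memory trail that a real deployment would ship to a SIEM instead:

```python
import logging

# Hypothetical audit sketch: every access decision is logged, not just denials.
logging.basicConfig(level=logging.INFO, format="%(asctime)s %(name)s %(message)s")
audit_log = logging.getLogger("audit")
audit_trail = []  # in-memory copy for the example; production would ship to a SIEM

def check_and_log(user: str, resource: str, allowed: bool) -> bool:
    """Record the verdict of each access check for later inspection."""
    verdict = "ALLOW" if allowed else "DENY"
    entry = f"{verdict} user={user} resource={resource}"
    audit_trail.append(entry)
    audit_log.info(entry)
    return allowed

check_and_log("alice", "customer_pii", False)
```

Logging the allows as well as the denies matters: a compromised credential produces perfectly “authorized” traffic, and only a full audit trail lets inspection tools spot that its pattern is abnormal.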
Here is a short (and certainly not exhaustive) list of techniques used to inspect all events happening in your network.
Note: There are many tools available that accomplish these. Please refer to Gartner’s Security Information Event Management (SIEM) Magic Quadrant to find the tools that may interest you.
It's not a question of if, but when, a data breach will happen. Hackers grow more sophisticated in their attacks and threaten everything from intellectual property to financial information to your customers’ Personally Identifiable Information (PII). The old model of the high, guarded perimeter around a trusted internal network no longer functions as a secure model. Zero Trust offers a more comprehensive approach to today’s data security needs. As you look to deploy this model, begin to seek out tools that will help you. Here is a short list of some of the tools to consider:
In many cases, adopting this approach will not be about bolting a few products onto your existing data security framework but about completely renovating it. Don’t let expediency force you to defend your data with only half measures. Take a deep dive into Zero Trust’s approach and see where you may be vulnerable.