Townsend Security Data Privacy Blog

Patrick Townsend

Recent Posts

The Cloud and Encryption Key Custody

Posted by Patrick Townsend on Aug 1, 2017 11:32:25 AM

You should be concerned about storing encryption keys in the cloud, but probably not for the reason you think.

One of the most common questions I get about cloud encryption key management is “Who has access to my encryption keys?” As customers migrate to Microsoft Azure and Amazon Web Services (AWS), it is important to understand the policy implications of cloud service provider encryption key management services. And it is not a topic that cloud service providers like to discuss very much.

The truth is that common key management services such as Microsoft Azure Key Vault and Amazon Web Services Key Management Service (KMS) are under the control and management of Microsoft and Amazon. The user interfaces and APIs available to customers on these cloud platforms are easy to use and very inexpensive, or even free. So they are very attractive to new cloud customers.

So what is the problem?

First, I don’t feel there is a problem with the security implementation of key management services by these cloud service providers. They have great security teams and I believe they take care in both the implementation of the security systems as well as in the hiring and management of the teams that support the key management systems. And you can’t really argue with the cost model of key management services. Cheap or free is always attractive.

The problem originates in the fact that the cloud service provider creates, manages, and owns the actual encryption keys that protect your data. This means that they are subject to law enforcement warrants and national security letter requirements to surrender your data and encryption keys. And you may not be notified of these actions. This is not an issue just in the United States. Many national governments have various legal rules that require cooperation by cloud service providers. Many cloud companies try to be transparent about these law enforcement activities, but transparency can be blocked in many cases.

Should cloud service providers refuse to obey lawful requests for information about their customers? Of course not. We all live in a nexus of laws and regulations that are largely designed to protect us. If a law enforcement warrant is lawfully obtained, a cloud service provider would be acting responsibly by complying with a request for copies of your data and your encryption keys. And they may not be able to inform you of that action.

And there is the problem. You might not know what is happening to your information stored in the cloud.

Any responsible executive team in a business or organization would want to know if there was a potential problem with an employee, group of employees, company policy, or operation in a local, federal or international environment. Executives want to be aware of potential problems and respond to them as quickly as possible. In fact, this is a core governance requirement. And they can’t act quickly and responsibly when they are not aware of a problem. And that’s the rub. If you give your cloud service provider access to your encryption keys you may lose the ability to know when a problem arises.

Is there a solution to this problem?

By deploying a third-party encryption key management solution in the cloud or on-premises in your own data center, you retain exclusive ownership of the encryption keys and the data they protect. Cloud service providers cannot respond to law enforcement and intelligence service actions because they have no administrative access to the encryption keys. This doesn’t mean that law enforcement and intelligence services won’t be interested in obtaining your information. But it does mean that they will have to notify you of their desire to obtain your data. Of course, as a responsible business you will want to comply with these requests. But you will do so with full knowledge of the activity and with the full advice of your own legal counsel. And the process will probably provide you with some clues as to the reason for the action. That’s something you will really want to know.

Retaining custody of your encryption keys means retaining control of your organization’s governance and risk-management controls. And that’s a good thing.

Knowing is better than not knowing.


Topics: Encryption Key Management

IBM i Privileged Users – A Unique Security Challenge

Posted by Patrick Townsend on Jun 27, 2017 8:54:41 AM

If you are an IBM i security administrator you know how hard it can be to determine a user’s true level of privilege on your system. IBM has given us a very flexible scheme to grant and restrict privileges to groups of users. And this flexibility can lead to unexpected security exposures. Let’s delve into this a bit deeper with an example (names are made up for this example):

JANICE
Janice is a regional manager on the sales team. She’s exceptionally effective at her job and has taken on a number of tasks that help her support her team and the sales goals of her region. Let’s take a look at her user profile:

User Profile . . . . . . . . . . . . . . . . . . . : JANICE
Special authority . . . . . . . . . . . . . . . : *SPLCTL
   
Group profile . . . . . . . . . . . . . . . . . . : SALES
Supplemental groups . . . . . . . . . . . : HRUSER PAYROLL REPORTING
  INVENTORY MANAGERS …

 

At first glance it would seem that Janice has a normal user level of special authorities. In fact, the only special authority is spool file control (*SPLCTL), which would be reasonable for a manager who needs to run and print reports. It also seems appropriate that Janice has a Group Profile of SALES. You would imagine that this probably gives her the ability to access the company sales management application.

The first hint of concern is the long list of supplemental groups. If you’ve met effective managers like Janice it won’t surprise you that they have access to a number of applications. She probably has responsibility for approving time off for her department’s employees, and has responsibilities for reporting to management. But what privileges are hidden in that Group Profile and in those Supplemental Groups?

Let’s take a look. 

SALES (Group profile)
When we display the SALES user profile we find these special authorities: 

User Profile . . . . . . . . . . . . . . . . . . . : SALES
Special authority . . . . . . . . . . . . . . . : *SPLCTL
  *JOBCTL
   
Group profile . . . . . . . . . . . . . . . . . . : *NONE
Supplemental groups . . . . . . . . . . . :  

 

Janice already had authority to spool files, but notice the job control value of *JOBCTL. This means that Janice has now inherited additional authority to manage jobs. This is not a severe uplift in privileges, but it shows how privilege escalation works.

Now, what about those supplemental groups? Do we have to look at every one?

Yes, we do. Let’s look at the HRUSER profile next.

HRUSER (Supplemental Group)
When we display the HRUSER user profile we see these authorities: 

User Profile . . . . . . . . . . . . . . . .  : HRUSER
Special authority . . . . . . . . . . . . : *SPLCTL
  *JOBCTL
  *SECADM
   
Group profile . . . . . . . . . . . . . . . : *NONE
Supplemental groups . . . . . . . . :  

 

Wow, the HRUSER profile has the special authority of security administration (*SECADM). That’s a bit worrying. If we had to guess, there is probably a third-party HR package that requires this, or the authority was just granted out of convenience. But now Janice has much more authority.

Let’s continue our exploration of those supplemental group profiles:

PAYROLL (Supplemental Group)
Let’s take a look at the PAYROLL user profile:

User Profile . . . . . . . . . . . . . . . . : PAYROLL
Special authority . . . . . . . . . . . . : *SPLCTL
  *JOBCTL
  *ALLOBJ
   
Group profile . . . . . . . . . . . . . . . . : *NONE
Supplemental groups . . . . . . . . . :  

 

Whoops, the PAYROLL user has All Object authority (*ALLOBJ). Bingo! This is the mother lode of privilege. A user with All Object authority basically has the keys to the kingdom. It is pretty much equivalent to being the QSECOFR security officer (“root” for you Linux nerds). Once you have All Object authority you can manage other user profiles, grant yourself additional authority, and basically access any data on the IBM i server.

If I am an attacker and I can steal Janice’s credentials for the IBM i server I now have all of the authority I need to infiltrate sensitive data.

Did you notice how much work it was to track down Janice’s true privilege level? As an IBM i security administrator you probably know how to fix this problem. You need to analyze the real need for the All Object authority and revoke it. But imagine that you managed a system with hundreds or thousands of users, and that you needed to check this at least monthly in order to detect any changes since the last time you inspected your users. It would truly be impossible to keep up with this task, and as the security administrator you might have other things you need to do, right?
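Conceptually, the check itself is simple: a user’s effective special authorities are the union of the authorities on the user profile, the group profile, and every supplemental group, and what matters is where each one comes from. Here is a minimal sketch of that logic in Python, using made-up profile data rather than real IBM i APIs:

# Sketch: resolve a user's cumulative special authorities from the user,
# group, and supplemental group profiles (hypothetical data, not IBM i APIs).
PROFILES = {
    "JANICE":    {"special": {"*SPLCTL"}, "group": "SALES",
                  "supplemental": ["HRUSER", "PAYROLL", "REPORTING", "INVENTORY", "MANAGERS"]},
    "SALES":     {"special": {"*SPLCTL", "*JOBCTL"}, "group": None, "supplemental": []},
    "HRUSER":    {"special": {"*SPLCTL", "*JOBCTL", "*SECADM"}, "group": None, "supplemental": []},
    "PAYROLL":   {"special": {"*SPLCTL", "*JOBCTL", "*ALLOBJ"}, "group": None, "supplemental": []},
    "REPORTING": {"special": set(), "group": None, "supplemental": []},
    "INVENTORY": {"special": set(), "group": None, "supplemental": []},
    "MANAGERS":  {"special": set(), "group": None, "supplemental": []},
}

def effective_authorities(user):
    profile = PROFILES[user]
    groups = ([profile["group"]] if profile["group"] else []) + profile["supplemental"]
    authorities = {auth: user for auth in profile["special"]}   # authority -> where it came from
    for g in groups:
        for auth in PROFILES[g]["special"]:
            authorities.setdefault(auth, g)                     # first source wins, for reporting
    return authorities

for auth, source in sorted(effective_authorities("JANICE").items()):
    print(f"{auth:10} inherited from {source}")
# *ALLOBJ shows up as inherited from PAYROLL, which is the finding that matters.

The hard part is not the logic; it is collecting and re-checking the profile data for hundreds or thousands of users, month after month.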

So, is there any hope?

Sure there is. Our Alliance LogAgent solution will do this work for you. You can run the User Authorization report and Alliance LogAgent will track down these authorities for you. It will tell you the overall inherited authority of any (or all) users, and where they are getting the authority. Here is an example of the output for Janice:

(Screenshot: Alliance LogAgent escalated privilege report for JANICE)

Notice that all of Janice’s cumulative authorities are listed right on the top line of the report detail. Then notice that the Group Profile and all Supplemental Group profiles are listed with their authorities. The PAYROLL user is clearly identified as having the All Object authority. Now you can go to work. 

The Alliance LogAgent report can be executed for all users, or for a group of users. And you can filter it so that you first get a list of all users who have inherited All Object authority. Then run it with additional authorities. In a few seconds you can find your privileged users, discover where they get that authority, and create a work plan to fix the problems.

However, Alliance LogAgent goes even further. As it is processing events from the security journal QAUDJRN, it can resolve in real time the true privilege of each user signing on to the IBM i server, tag job start events where the user has elevated privileges, and send them to your SIEM for monitoring. In real time.

I think that’s pretty powerful, don’t you?

Patrick


Topics: IBM i, Alliance LogAgent

Who owns my encryption key in the Amazon AWS cloud?

Posted by Patrick Townsend on May 16, 2017 9:41:44 AM

One of the most frequent questions I receive about encryption in the AWS cloud is “Who owns the encryption keys in the cloud?” and “Does Amazon have access to my keys?” I understand why this is a confusing question. I also understand why the question is important to many Enterprise customers. Cloud service providers don’t like to talk about this very much, so let’s spend some time running this to ground. We’ll start with the question about Amazon’s access to your encryption keys.

Amazon Web Services provides two encryption key management options:

  • AWS Cloud HSM
  • AWS Key Management Service (KMS)

The answer to the question of key ownership depends on which service you are using. Let’s deal with the easy one first.

The AWS Cloud HSM is a physical hardware security module (HSM) that is dedicated to you. It is physically located in an AWS regional cloud data center, but only you have administrative access to the key server. Amazon is clear on the topic of encryption key ownership with the Cloud HSM service: Only you have access to the keys. Of course, Amazon has physical access to the HSM in their data center, but I think we can trust Amazon’s claim that only you have access to the actual encryption keys in a Cloud HSM server.

Now let’s turn our attention to the AWS Key Management Service, or KMS. This is a multi-tenant service provided by Amazon which is backed by an Amazon hardware security module. That is, Amazon creates a key that is only used by you, but that key is protected (encrypted) by an Amazon-managed HSM. When you create a key in KMS it is called a Customer Master Key, or CMK. The CMK is actually a data structure that contains your symmetric key and metadata about the key. The CMK is protected by an Amazon HSM key. So, the answer to the question about who owns your key is straightforward: you and Amazon share ownership of the encryption key, and that ownership is equal. You both can access the raw encryption key.
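To make the custody model concrete, here is a minimal boto3 sketch of how applications typically use a KMS CMK: you never handle the CMK itself, you only ask KMS to generate and unwrap data keys under it. The region and key alias are placeholders, and the sketch assumes AWS credentials are already configured:

# Sketch: envelope encryption with AWS KMS via boto3.
# The CMK behind the alias never leaves AWS; only data keys do.
import boto3

kms = boto3.client("kms", region_name="us-west-2")    # placeholder region

# Ask KMS to generate a data key under the CMK (placeholder alias).
resp = kms.generate_data_key(KeyId="alias/my-app-key", KeySpec="AES_256")
plaintext_key = resp["Plaintext"]       # use locally to encrypt data, then discard
wrapped_key   = resp["CiphertextBlob"]  # store alongside the encrypted data

# Later, only KMS (holding the CMK) can unwrap the stored data key.
restored = kms.decrypt(CiphertextBlob=wrapped_key)["Plaintext"]
assert restored == plaintext_key

The point of the sketch is that every unwrap operation goes back to the KMS service, because the CMK material itself stays inside Amazon’s HSMs.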

Recently Amazon introduced a new “Bring Your Own Key” option for the KMS service. Does this change anything about who has access to the key? No, you are bringing your own encryption key and loading it into the AWS KMS service as a part of a CMK, but it is still protected in the KMS service by an Amazon HSM key. This means that you and Amazon share equal ownership of the key and both of you have access to the key. The only difference with Bring Your Own Key is that you retain the actual value of the encryption key outside of the AWS cloud.

So, to summarize: The AWS Cloud HSM service provides dedicated encryption keys that only you have access to. The AWS Key Management Service provides encryption keys and both you and Amazon have access to the key.

So, why is this important? Here are some comments that Enterprise customers have shared with me:

In almost every country both law enforcement and national security agencies have legal means to compel a cloud service provider to surrender data related to cloud customers. Certainly in the US this is the case, and it is true in most other countries. In the US it is additionally true that law enforcement and national security agencies may access this information and prohibit the cloud service provider from notifying you – the customer – that access has been granted. Cloud service providers like Amazon and others naturally abide by the laws of the countries in which they operate. But this means that your encryption keys in AWS KMS can be surrendered without your knowledge. You may not like this aspect of your country’s legal system, but it is a fact of life.

Why is this of concern to Enterprise customers? It is because significant law enforcement or intelligence service activity concerning your employees or customers may take place without your knowledge. If you are an executive in a large Enterprise you really want to know if there are potential problems in your workforce, or significant problems with a customer or service provider. Questions like these might arise:

  • Do I have an employee engaging in illegal activity?
  • Do I have multiple employees colluding together to engage in illegal activity?
  • Is one of my customers engaging in criminal activity that may compromise my business?
  • Are there managers in my organization that are breaking the law?
  • Is there some activity that may significantly damage my business reputation?
  • How can I deal with a problem if I don’t know about it?

When your IT systems are physically located in your data center, law enforcement and intelligence agencies have to contact you to get access to data. That is not the case in the cloud – you will be in the dark in many cases.

In my experience, Enterprise customers will cooperate with their legal obligations to provide data to law enforcement. The concern is not whether to surrender data in a criminal investigation. Enterprise customers simply want to know when significant legal events take place that affect their organizations.

The critical concern is visibility of law enforcement and intelligence service activity that affects you. For this reason many Enterprise customers will not use the AWS Key Management Service. And because they do not have physical access to the Amazon Cloud HSM devices, they will not use this dedicated encryption key management service either.

I hope this clarifies some of the issues related to the Amazon key management options. Of course, these issues are not exclusive to Amazon, the same issues are relevant to the Microsoft, IBM and Google cloud platforms. There are good alternative options to cloud encryption key management services and we will cover those in a separate blog.

Patrick


Topics: Amazon Web Services (AWS), Encryption Key Management

Financial Services and Creating a Security Strategy

Posted by Patrick Townsend on May 9, 2017 9:04:17 AM

I recently spent the better part of an hour talking to a new IT director for a small financial services company. He was feeling overwhelmed by the scope of the work ahead of him, and was bemoaning the lack of any guidance on how to start. The set of tasks in front of him seemed gargantuan in both number and scope. I can understand that sense of panic when you realize that you are behind the curve and that your organization is facing real threats. I want to share some of the advice I gave this IT director (with a tip of the hat to all of the hardworking security professionals who’ve shared their advice with me!).

It’s a Process, Not a Destination

The first error I see many IT managers make is that they look at security as a set of tasks to accomplish rather than a set of new IT processes. We technical folks really like to make a task list and check items off. We have a sense of accomplishment when we get to the end of the list. It’s done! Hallelujah!

Sorry, security is never done. It is important to realize that a security program means that many people throughout your organization are going to be doing things differently, and will be adjusting to new threats over time. For example, we used to think that the use of strong passwords was adequate to protect our access to corporate web services. But it isn’t enough now. Now we have to use multi-factor authentication in addition to strong passwords. Why? Attacks on password-protected assets have become more sophisticated. We have to step up our game. And this is true across a number of security practice areas.

If you are successful you will be changing how your organization OPERATES over time. Not just completing a set of tasks.

Know Where Your Sensitive Data Is

It is very common that businesses do not actually know where their sensitive data resides in the organization, and where it goes outside of the organization. Businesses are always undergoing change to meet new objectives, counter emerging competitive threats, accommodate new technologies, and comply with new regulations. Managing a business is like fighting a war on many fronts – it is barely organized chaos!

It is understandable then that an IT organization may not have a clear map of its critical data. But you will need that map before you can start really protecting those assets. For example, you might have data extracts uploaded to your web site for customers but not know that the upload process was put in place five years ago and the developer has since moved on. That sensitive data just gets uploaded every night and might not be properly protected.

It’s time to do some archeology.

Be sure you have an inventory of all of your critical applications along with the data they process. This is going to seem like a tedious job, but it will be critical to everything you do. Make the map and then hold a celebration and invite your executive team.

In the process don’t forget the data feeds. Document every place that data enters your organization from the outside, and where you send data to outside services.

Find a Dynamic Security Framework

Now you need a plan! Fortunately you won’t have to figure out a plan on your own. There are several good sources of dynamic security planning guides that you can use as a starting point. A good plan will cover the essential security tasks, and will prioritize them by importance. A complete plan with prioritized tasks will help you focus your attention in the right areas!

Here are some sources for security plans that you can access and use right away:

The great thing about these security plans and frameworks is that you can get started with them very quickly. For example, the CIS Critical Security Controls is available as an Excel spreadsheet. You can make a copy and start working through the sections from top to bottom.

Do the Important Things First

We are sometimes tempted to do some of the easy things first in order to convey a level of accomplishment to our management team. I recommend that you try to resist this tendency as much as possible. Start with the most important items in your priority list and tackle those first. They often give you a lot of security benefit and many do not require a lot of investment or work. It is important to do the most effective and critical tasks first.

Get Your Management Buy-in

Security takes commitment, human resources, financial resources, and much more. You will need to get your management buy-in as quickly as possible. Start by sharing some stories from other companies in the financial services segment. We don’t necessarily want to scare our managers, but they need to have a realistic idea of the threat.

Educating your management team means explaining your need for budget resources. Some things can be done on the cheap, and you won’t want to overlook inexpensive steps to take that improve security. But some things are going to take some budget dollars to deploy. For example, continuous monitoring of system logs with a SIEM solution is one of the most effective security strategies you can deploy. But this will almost certainly mean the deployment of a commercial SIEM solution and this will require fiscal expenditures.

Any steps you take to educate your management team will be worth the effort.

Don’t Forget About Employee Education

Remember that you live in the security world, but the employees in your organization don’t. They are not likely to be up to date on the latest threats. Educating employees on how to identify spam email messages has a lot of benefits. Find ways to work a simple security awareness exercise into employee schedules for a few minutes each week.

You’ve probably heard of Bug Bounties – how about providing some small rewards to employees who discover and report spam emails with potentially harmful content? It is amazing how effective programs like this are.

Rinse and Repeat

Let’s go back to that first point. A security program is something that changes how you and your colleagues live your professional lives – it is not a set of checkboxes. Create an annual calendar of security tasks and review points. Make sure that this includes periodic reviews with the upper management team. If you are doing this right you will be making periodic adjustments to the security program and things that are important today may be eclipsed by new threats tomorrow. That’s not a particularly happy thought, but if you keep adjusting you will be in a safer position.

Finally, we make progress one step at a time. Once you start down this road it will get easier as you progress. Good luck with your new security programs!

Patrick


Topics: Data Security, Security Strategy

Splunk, Alliance LogAgent, and the LEEF data format

Posted by Patrick Townsend on Apr 18, 2017 7:09:08 AM

We have a lot of Enterprise customers deploying our Alliance LogAgent solution for the IBM i server to send security events to Splunk. On occasion a customer will deploy Alliance LogAgent and send data in the Log Event Extended Format (LEEF) to Splunk. The LEEF format is the preferred log data format for the IBM Security QRadar SIEM, so I’ve always found this a bit puzzling.

The light finally came on for me this week.

Security event information in syslog format (see RFC 3164) is largely unstructured data. And unstructured data is hard for SIEM solutions to understand. Here is an example from an Apache web server log:

[Wed Oct 11 14:32:52 2000] [error] [client 127.0.0.1] client denied by server configuration: /export/home/live/ap/htdocs/test

A SIEM administrator would have to do a fair amount of work configuring the SIEM to properly understand the importance of this message and take the proper action. If only the data were in some type of normalized format!

It turns out that the IBM Security QRadar LEEF format normalizes the system log information. A message like the above might look something like this in LEEF format:

date=20001011 time=143252 ipAddress=127.0.0.1 violation=client denied severity=5 path=/export/home/live/ap/htdocs/test

With field definitions like “date” and “time” the Splunk SIEM can easily digest the message and the Splunk query tools work great. It is easy to create reports and dashboards with this type of normalized data. The LEEF format is really good about this and Alliance LogAgent supports the LEEF definition.
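The difference shows up immediately in code. A normalized event can be parsed with a few generic lines, while the raw Apache message above would need its own hand-built pattern. Here is a minimal Python sketch using the sample LEEF-style event (the field names are simply the ones from that sample):

# Sketch: normalized key=value events parse trivially compared to raw syslog text.
import re

event = ("date=20001011 time=143252 ipAddress=127.0.0.1 "
         "violation=client denied severity=5 path=/export/home/live/ap/htdocs/test")

def parse_normalized(line):
    # Split on "key=" boundaries so values may contain spaces (e.g. "client denied").
    pairs = re.findall(r"(\w+)=(.*?)(?=\s+\w+=|$)", line)
    return dict(pairs)

fields = parse_normalized(event)
print(fields["ipAddress"], fields["severity"], fields["violation"])
# Prints: 127.0.0.1 5 client denied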

What most Splunk administrators do not realize is that our Alliance LogAgent solution already normalizes all IBM i security events in exactly this fashion. In fact, this normalized format is the default data format for security events.

When we started the development of Alliance LogAgent more than 10 years ago we understood at the outset that system log data would be hard for a SIEM to parse. So from the first release of our solution we provided data in this normalized format. Whether you are using Splunk, LogRhythm, Alert Logic, or any other SIEM, we make it really easy for the SIEM to digest and act on the information. And forensics, queries, and dashboards are easy to create.

So, Splunk users - listen up! The default system log format in Alliance LogAgent is exactly what you need to make Splunk work really well. You can use the LEEF format if you really want to, but you have all of the benefits of normalized data with the default format.

Here at Townsend Security we are vendor neutral when it comes to SIEM solutions. Our customers deploy a wide range of solutions including Splunk, IBM QRadar, LogRhythm, Alert Logic, SolarWinds, McAfee, and lots more. And they can move from one SIEM to another without changing their Alliance LogAgent configurations. We believe that actively monitoring system logs in real time is one of the most important security steps you can take. Early detection of a problem is so much better than trying to remediate a breach after the fact.

Patrick


Topics: Alliance LogAgent, Splunk

SQL Server Column Level Encryption

Posted by Patrick Townsend on Feb 28, 2017 9:11:00 AM

Microsoft customers attempting to meet security best practices, comply with regulations, and protect their organization’s digital assets turn to encryption of sensitive data in Microsoft SQL Server databases. The easiest way to encrypt data in SQL Server is through Transparent Data Encryption (TDE), which is a supported feature in SQL Server Enterprise Edition. For a variety of reasons, TDE may not be the optimal solution. Microsoft customers using SQL Server Standard, Web, and Express Editions do not have access to the TDE feature. And even when using SQL Server Enterprise Edition, TDE may not be the best choice for very large databases.

Let’s look at some approaches to column level encryption in SQL Server. The following discussion assumes that you want to meet encryption key management best practices by storing encryption keys away from the protected data, and retain full and exclusive control of your encryption keys.

Column Level Encryption (aka Cell Level Encryption) 
Starting with the release of SQL Server 2008, all Enterprise editions of the database have supported the Extensible Key Management (EKM) architecture. The EKM architecture allows for two encryption options: Transparent Data Encryption (TDE) and Column Level Encryption (CLE). Cell Level Encryption is the term Microsoft uses for column level encryption. SQL Server Enterprise edition customers automatically have access to column level encryption through the EKM architecture.

Encryption Key Management solution providers can support both TDE and Column Level Encryption through their EKM Provider software. However, not all key management providers support both - some only support TDE encryption. If your key management vendor supports Cell Level Encryption this provides a path to column level encryption in SQL Server Enterprise editions.

Application Layer Encryption
Another approach to column level encryption that works well for SQL Server Standard, Web, and Express editions is to implement encryption and decryption at the application layer. This means that your application performs encryption on a column’s content before inserting or updating the database, and performs decryption on a column’s content after reading a value from the database. Almost all modern application languages support the industry standard AES encryption algorithm. Implementing encryption in languages such as C#, Java, Perl, Python, and other programming languages is now efficient and relatively painless.

The challenge that developers face when implementing encryption at the application layer is the proper protection of encryption keys. Security best practices and compliance regulations require a high level of protection of encryption keys. This is best accomplished through the use of an encryption key management system specifically designed to create, securely store, and manage strong encryption keys. For developers, the primary challenge in a SQL Server encryption project is integrating the application with the key manager. Many vendors of key management systems make this easier by providing Software Development Kits (SDKs) and sample code to help the developer accomplish this task easily.
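As an illustration of the application-layer approach, here is a minimal Python sketch using the open source cryptography package: the column value is encrypted with AES-256-GCM before the INSERT and decrypted after the SELECT, so the database only ever stores ciphertext. The randomly generated key is a placeholder for a key retrieved from your key management system through its SDK:

# Sketch: application-layer (column-level) AES-256-GCM encryption.
# In production the key comes from a key management system; os.urandom here
# is only a stand-in so the example runs on its own.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = os.urandom(32)             # placeholder for a key fetched from your key manager
aesgcm = AESGCM(key)

def encrypt_column(plaintext: str) -> bytes:
    nonce = os.urandom(12)                                    # unique per value
    return nonce + aesgcm.encrypt(nonce, plaintext.encode(), None)

def decrypt_column(blob: bytes) -> str:
    nonce, ciphertext = blob[:12], blob[12:]
    return aesgcm.decrypt(nonce, ciphertext, None).decode()

# The application encrypts before INSERT and decrypts after SELECT,
# so SQL Server only ever stores ciphertext (e.g. in a varbinary column).
stored = encrypt_column("4111-1111-1111-1111")
print(decrypt_column(stored))

The same pattern applies in C#, Java, or any other language with a standard AES library; the real design work is in the key retrieval, caching, and rotation logic, which is where the key manager’s SDK earns its keep.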

SQL Views and Triggers with User Defined Functions (UDFs)
Another approach to column level encryption involves the use of SQL Views and Triggers. Leveraging the use of User Defined Functions (UDFs) the database administrator and application developer can implement column level encryption by creating SQL Views over existing tables, then implementing SQL Triggers to invoke user defined functions that retrieve encryption keys and perform encryption and decryption tasks. This approach has the advantage of minimizing the amount of application programming that is required, but does require analysis of the SQL database and the use of User Defined Functions. Database administrators and application developers may be able to leverage the SDKs provided by an encryption key management solution to make this process easier.

SQL Server Always Encrypted
One promising new technology recently implemented by Microsoft is SQL Server Always Encrypted. This feature is new with SQL Server 2016 and can work with any edition of SQL Server. It is a client-side architecture which means that column data is encrypted before it is sent to the database, and decrypted after it is retrieved from the database. While there are many constraints in how you can put and get data from SQL Server, it is a promising new technology that will help some customers protect data at the column level. You can expect to see support for Always Encrypted being announced by encryption key management vendors in the near future.

SQL Server in the Azure Cloud
As Microsoft customers and ISVs move to the Azure cloud they are taking their SQL Server applications with them. And it is very common that they take full implementations of SQL Server into their Azure virtual cloud instances. When SQL Server applications run in a virtual machine in Azure they support the same options for column level encryption as described above. This includes support for Cell Level Encryption through the EKM Provider architecture as well as application layer encryption. As in traditional IT infrastructure the challenge of encryption key management follows you into the Azure cloud. Azure customers should look to their encryption key management vendors to provide guidance on support for their key management solution and SDKs in Azure. Not all key management solutions run in Azure and Azure is not a supported platform for all vendor SDKs.

Azure SQL Database
In the Azure cloud Microsoft offers the SQL Server database as a cloud service. That is, Microsoft hosts the SQL Server database in the cloud and your applications can use this service rather than a full instance of SQL Server in your cloud instance. Unfortunately, Azure SQL Database only supports Transparent Data Encryption through the EKM Provider interface and does not yet support Cell Level Encryption. It also restricts encryption key management to only the Azure Key Vault facility requiring you to share key custody with Microsoft.

Column level encryption at the application layer is fully supported for Azure SQL Database. As in the traditional IT infrastructure, your C#, Java, and other applications can encrypt and decrypt sensitive data above the database level. Again, check with your key management solution provider to ensure that application level SDKs are supported in the Azure cloud.

AWS Cloud and SQL Server
The Amazon Web Services (AWS) implementation of cloud workloads parallels that of Microsoft Azure. You can deploy a full instance of SQL Server in an AWS EC2 instance and use the features of SQL Server as in traditional IT infrastructure. Amazon also offers a database service called Amazon Relational Database Service, or RDS. The RDS service offers multiple relational databases including SQL Server. As with Azure, there is no support for key management solutions other than the Amazon Key Management Service (KMS), requiring a shared implementation of key custody.

As you can see, there are many ways to implement column level encryption in SQL Server while following good encryption key management practices. I hope this helps you on your journey to more secure data in SQL Server.

Patrick


Topics: Encryption, SQL Server, Cell Level Encryption

Fixing the TDE Key Management Problem in Microsoft SQL Server

Posted by Patrick Townsend on Jan 10, 2017 7:31:56 AM

Many Microsoft SQL Server users have taken the first step to protect sensitive data such as Personally Identifiable Information (PII), Protected Health Information (PHI), Primary Account numbers (PAN) and Non-Public Information (NPI) by encrypting their databases with Transparent Data Encryption (TDE). It is extremely easy to implement TDE encryption as it does not require program changes.

A common cause of audit failures might not be so obvious: the failure to properly protect the SQL Server key encryption key once you activate encryption in SQL Server. With Transparent Data Encryption you have the choice of storing the service master key within the SQL Server context itself, or protecting the master key with a key management system using the SQL Server Extensible Key Management (EKM) interface. Why is it important to do this?

It turns out that it is easy for cyber criminals to recover the SQL Server master key when it is stored within SQL Server itself. (Examples: https://blog.netspi.com/decrypting-mssql-credential-passwords/ and http://simonmcauliffe.com/technology/tde/#hardware)

Simon McAuliffe provides the clearest explanation I’ve seen on the insecurity of locally stored TDE keys in SQL Server. I don’t agree with him on the question of using a key manager to improve security. Given that there is no perfect security, I believe that you can get significant security advantages through a properly implemented key management interface.

If your TDE keys are stored locally, don’t panic. It turns out to be very easy to migrate to a key management solution. Assuming you’ve installed our SQL Server EKM Provider called Key Connection on your SQL Server instance, here are the steps to migrate your Service Master Key to key management protection using our Alliance Key Manager solution. You don’t even need to bring down SQL server to do this (from the Alliance Key Manager Key Connection manual):

Protecting an existing TDE key with Alliance Key Manager

First create a new asymmetric key pair within the AKM Administrative Console using the “Create EKM Key” and the “Enable Key for EKM” commands.

Then return to SQL Server and call the following command to create the asymmetric key alias for the new KEK that you created on the AKM server:

use master;

create asymmetric key my_new_kek from provider KeyConnection with provider_key_name = 'NEW_TDE_KEK', creation_disposition = open_existing;

In this example, NEW_TDE_KEK is the name of the new key on AKM, and my_new_kek is the key alias.

Then use the ALTER DATABASE statement to re-encrypt the DEK with the new KEK alias assigned in the previous statement:

ALTER DATABASE ENCRYPTION KEY
ENCRYPTION BY SERVER ASYMMETRIC KEY my_new_kek;

Note that you do not have to take the database offline to perform this action.

Of course, there are other steps that you should take to secure your environment, but I wanted to demonstrate how easy it is to make the change.

The SQL Server DBA and the network administrator will have lots of other considerations in relation to SQL Server encryption. This includes support for clustering and high availability, automatic failover to secondary key servers, adequate support for separation of duties (SOD) and compliance, and the security of the credentials needed to validate SQL Server to the key manager. All of these concerns need to be addressed in a key management deployment.

For SQL Server users who deploy within a VMware or cloud infrastructure (AWS, Azure), Alliance Key Manager can run natively in your environment, too. It does not require a hardware security module (HSM) to achieve good key management with SQL Server. You have lots of choices in how you deploy your key management solution.

It turns out not to be difficult at all to address your SQL Server encryption key insecurities!

Patrick


Topics: SQL Server, Transparent Data Encryption (TDE)

OpenSSH on the IBM i and Your Security

Posted by Patrick Townsend on Jan 3, 2017 7:45:48 AM

Lately I’ve seen some criticism of the OpenSSH implementation on the IBM i platform which seems to imply that a third-party implementation of the Secure Shell (SSH) file transfer application is better than the IBM no-charge licensed OpenSSH implementation. I disagree with that opinion and think there are good security and implementation reasons to stick with the IBM OpenSSH implementation.

Here are some reasons why I like OpenSSH:

OpenSSH is supported by a global open source community

Tatu Ylönen founded SSH Communications Security in 1995 and produced the first versions of an open source SSH implementation. Since 1999 the OpenSSH application has been maintained by the OpenBSD Project, which is funded by the OpenBSD Foundation and led by Theo de Raadt. OpenSSH is available on a wide variety of operating systems including the IBM i, where it is delivered as a no-charge licensed product and maintained by IBM. OpenSSH continues to be actively developed and new encryption algorithms have been added recently.

OpenSSH is widely used by large and small organizations

By some estimates the OpenSSH implementation of the SSH protocol and applications commands a 97 percent market share for SSH implementations. This means that OpenSSH is in wide use by large and small organizations to securely manage their eCommerce needs. This also means that OpenSSH receives a lot of scrutiny by compliance and security experts. Widely deployed solutions tend to get more scrutiny from security experts, and this is true for OpenSSH.

OpenSSH is secure

No application is immune to security challenges. However, OpenBSD and the OpenSSH application in particular have a stellar record for security. With security products, deep expertise and commitment matter. OpenSSH started with security as a leading goal by its developers and it shows. Over the last few years there have been fewer than a dozen security issues, and most were unlikely to be exploited and all were patched rapidly through updates by IBM. The OpenBSD set of applications that include OpenSSH have a great record on security. If you think the IBM i platform has a good security record, take a look at OpenSSH.

IBM provides technical support for OpenSSH

We have all developed a deep appreciation for IBM’s commitment to security over the years. It is one of the great values of the IBM i platform. As new vulnerabilities are discovered you need a reliable and timely source of patches and enhancements, and IBM has stood behind this critical application. Security notifications are managed by IBM so that you know when you need to do an update. By making OpenSSH a no-charge licensed program, IBM i customers get patches through the normal PTF update process. Do you know any third-party IBM i vendor with an equal commitment to notification, maintenance and patching? IBM has earned our trust through this process.

OpenSSH is PCI compliant

PCI Qualified Security Assessors (QSAs) like Coalfire, TrustWave and others recognize that a properly patched implementation of OpenSSH meets PCI Data Security Standards (PCI-DSS) compliance, and IBM also tracks OpenSSH for PCI compliance. This again reflects IBM’s and OpenBSD’s commitment to security. If you are using a third-party IBM i solution for SSH how well is it tracked by the PCI audit community?

SSH is a complex protocol

Bruce Schneier said “Complexity is the enemy of security.” SSH is a complex protocol and this means that extra care needs to be taken in its development, deployment and maintenance. No third-party SSH solution rises to the level of care taken by the OpenSSH community and by IBM. Almost every business depends on secure file transfer for daily business operations. Deploying the most secure SSH solution is a critical security step.

OpenSSH does not use OpenSSL or Java JSSE

We’ve read a lot over the last few months about security issues in OpenSSL and Java. Many IBM i customers are confused about the relationship between OpenSSH and OpenSSL. In fact, OpenSSH does not use OpenSSL’s SSL/TLS protocol implementation for its communications, which is why OpenSSH was not subject to Heartbleed and other OpenSSL protocol vulnerabilities. We are all also now painfully aware of the security issues in Java. Most browsers no longer allow Java plugins for this reason. Third-party SSH products may or may not use OpenSSL or Java for communications. If you are running a third-party IBM i SSH solution, do you know if it uses OpenSSL or Java?

Third-party SSH solutions provide no significant advantage over OpenSSH

OpenSSH is a secure, reliable, and resilient implementation of SSH for secure data transfer that is backed by IBM and a worldwide community of users and developers. Our Alliance FTP Manager solution fully integrates with the IBM i OpenSSH application for secure, automated and managed file transfer. Our solution automates the OpenSSH transfer of hundreds of thousands of file transfers every day without compromising security.

My opinion? You probably don’t need more IT risk in your life. Stick with OpenSSH for your security needs. You will be in good company.

Patrick


Topics: IBM i, Secure Managed File Transfer

New York Department of Financial Services (NYDFS) and Encryption - 8 Things to Do Now

Posted by Patrick Townsend on Dec 12, 2016 10:27:38 AM

The New York Department of Financial Services (NYDFS) surprised the financial services industry by fast tracking new cybersecurity regulations in September of 2016. Due to go into effect in January of 2017 with a one-year transition period, it takes a very prescriptive approach to cybersecurity which includes a mandate to encrypt data at rest. The financial sector is broadly defined as banks, insurance companies, consumer lenders, money transmitters, and others. The law is formally known as 23 NYCRR 500 and you can get it here.

There isn’t much wiggle room on the requirement for encrypting sensitive data. You can use compensating controls if you can show that encryption is “infeasible”. But I am not sure how you would show that. All modern database systems used by financial applications support encryption. It would be hard to imagine a financial database where encryption would not be feasible. Don’t plan on that being an excuse to delay encrypting data at rest!

The time frame is short for implementing the encryption mandate. One year seems like a long time, but it is extremely aggressive given the development backlog I see in most banks.

Here are some things you should start doing right now:

1) Inventory All of Your Financial Systems

This seems like a no-brainer, but you might be surprised how many organizations have no formal inventory of their IT systems that contain financial data. This is a top-of-the list item on any cybersecurity list of recommendations, so making or updating this list will have a lot of benefits.

2) Document Storage of All Sensitive Information (Non-Public Information, or NPI)

For each system in your inventory (see above) document every database and storage mechanism that stores NPI. For database systems identify all tables and columns that contain NPI. You will need this documentation to meet the NYDFS requirements, and it is a roadmap to meeting the encryption requirements.

3) Prioritize Your Encryption Projects

You won’t be able to do everything at once. Following all modern cybersecurity recommendations, prioritize the systems and applications that should be addressed using a risk model. Here are a few factors that can help you prioritize:

  • Sensitivity of data
  • Amount of data at risk
  • Exposure risk of the systems and data
  • Compliance risk
  • Operational impact of loss

It is OK to be practical about how you prioritize the systems, but avoid assigning a high priority to a system because it might be easiest. It is better to tackle the biggest risks first.

4) Establish Encryption Standards

Be careful which encryption algorithms you use to protect sensitive data. In the event of a loss you won’t want to be using home-grown or non-standard encryption. Protect data at rest with NIST compliant, 256-bit AES encryption. This will give you the most defensible encryption strategy and is readily available in all major operating systems such as Windows, Linux, and IBM enterprise systems.

5) Establish Key Management Standards

Protecting encryption keys is the most important part of your encryption strategy and the one area where many organizations fail. Encryption keys should be stored away from the encrypted financial data in a security device specifically designed for this task. There are a number of commercial key management systems to choose from. Be sure your system is FIPS 140-2 compliant and implements the industry standard Key Management Interoperability Protocol (KMIP).

Hint: Don’t fall into the project-killing trap of trying to find a key management system that can meet every key management need you have in the organization. The industry just isn’t there yet. Pick a small number of key management vendors with best-of-breed solutions.
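As an illustration of what a KMIP-based integration looks like from the application side, here is a minimal sketch using the open source PyKMIP client. The hostname, port, and certificate paths are placeholders for your own key server and client credentials:

# Sketch: creating and retrieving an AES-256 key from a KMIP-compliant
# key manager with PyKMIP. All connection details below are placeholders.
from kmip.pie.client import ProxyKmipClient
from kmip import enums

client = ProxyKmipClient(
    hostname="keymanager.example.com",   # placeholder key server
    port=5696,
    cert="/etc/pki/client-cert.pem",     # placeholder client credentials
    key="/etc/pki/client-key.pem",
    ca="/etc/pki/ca.pem",
)

with client:
    key_id = client.create(enums.CryptographicAlgorithm.AES, 256)  # key is created on the server
    key = client.get(key_id)             # retrieved only when the application needs it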

With encryption standards well defined and an encryption key management strategy in hand you are ready to get started with your encryption projects.

6) Analyze Performance and Operational Impacts

Encryption will naturally involve some performance and operational impacts. Encryption is a CPU intensive task, so plan on doing some performance analysis of your application in real-world scenarios. If you don’t have test environments that support this analysis, get started now to create them. They will be invaluable as you move forward. Modern encryption is highly optimized, and you can implement encryption without degrading the user experience. Just be prepared to do this analysis before you go live.

There are also operational impacts when you start encrypting data. Your backups may take a bit more storage and take longer to execute. So be sure to analyze this as a part of your proof-of-concept. Encrypted data does not compress as well as unencrypted data and this is the main cause of operational slow-downs. For most organizations this will not be a major impact, but be sure to test this before you deploy encryption.
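You can see the compression effect for yourself with a quick experiment: repetitive business data compresses dramatically, while good ciphertext is statistically random and barely compresses at all. A small illustrative Python sketch (the sample record is made up):

# Sketch: why encrypted data inflates backups -- ciphertext does not compress.
import os, zlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

plaintext = b"2017-05-09,EXAMPLE CREDIT UNION,ACH TRANSFER,1042.00\n" * 20000
ciphertext = AESGCM(os.urandom(32)).encrypt(os.urandom(12), plaintext, None)

print(round(len(zlib.compress(plaintext)) / len(plaintext), 3))    # tiny ratio: compresses well
print(round(len(zlib.compress(ciphertext)) / len(ciphertext), 3))  # about 1.0: incompressible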

8) Get Started

Oddly (to me at least) many organizations just fail to start their encryption projects even when they have done the initial planning. A lack of commitment by senior management, lack of IT resources, competing business objectives, and other barriers can delay a project. Don’t let your organization fall into this trap. Do your first project, get it into production, and analyze the project to determine how to do it better as you move forward.

Fortunately we have a lot of resources available to us today that were not available 10 years ago. Good encryption solutions are available and affordable for traditional on-premises environments, for VMware infrastructure, and for cloud applications.

You can meet the NYDFS requirements and timelines if you start now. But don’t put this one off.

Patrick

 

Resources:

New York Department of Financial Services:

http://www.dfs.ny.gov/legal/regulations/proposed/propdfs.htm

 

Harvard Law School analysis of NYDFS:

https://corpgov.law.harvard.edu/2016/09/24/nydfs-proposed-cybersecurity-regulation-for-financial-services-companies/


 

Topics: Compliance, Encryption

IBM i Security Architecture & Active Monitoring

Posted by Patrick Townsend on Dec 6, 2016 7:30:42 AM

Excerpt from the eBook "IBM i Security: Event Logging & Active Monitoring - A Step By Step Guide."


Active monitoring (sometimes referred to as Continuous Monitoring) is a critical security control for all organizations. And it is one of the most effective security controls you can deploy. The large majority of security breaches occur on systems that have been compromised days, weeks, or even months before sensitive data is lost. A recent Verizon Data Breach Investigations Report indicates that a full 84 percent of all breaches were detected in system logs. This is why the Center for Internet Security includes active monitoring as a Critical Security Control (Control number 6).

There are several elements of a truly effective Active Monitoring strategy:

Central Collection and Repository of All Events
Attackers almost never start with your core IBM i server directly. They attack a web application or infect a user PC and work their way into the IBM i server. A defensible active monitoring strategy has to collect events from across the entire organization. By the time they show up on your IBM i server they have probably compromised a number of intermediate systems and an opportunity to prevent the breach has been missed. Collect all events across your entire IT infrastructure to gain the best early detection opportunities.

Real Time Event Collection
Data breaches are happening much faster than in the past. In some cases the loss of data happens just minutes after the initial breach. This means that you must collect security events in real time. Good active monitoring solutions are able to digest threat information in real time and give you the chance to deter an attack. Avoid batch event collection – you can collect IBM i security audit journal information in real time, and you should.

Event Correlation
Event correlation is key to an effective active monitoring solution. This is typically accomplished through the use of special software implemented in Security Information and Event Management (SIEM) solutions. Highly automated SIEM solutions have the ability to correlate events across a large number of systems and automatically identify potential problems. They do exactly what we want computer systems to do – handle large amounts of data and apply intelligent interpretation of the data.

Anomaly Detection
Anomaly detection is another aspect of active monitoring. That unusual system login at 3:00am on a Sunday morning would probably escape the attention of our human IT team members, but good active monitoring solutions can see that anomalous event and report on it.
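As a simple illustration of the idea, the sketch below flags logons that fall outside an assumed business-hours window, using already-normalized events. Real SIEM anomaly detection is statistical and correlates across many systems, so treat this only as a toy example:

# Sketch: flag logons outside business hours from normalized events.
# The business-hours policy and sample events below are assumptions.
from datetime import datetime

BUSINESS_HOURS = range(6, 19)          # 6:00 - 18:59, an assumed policy
WORKDAYS = range(0, 5)                 # Monday - Friday

events = [
    {"user": "JANICE",  "event": "LOGON", "timestamp": "2016-11-28T09:15:00"},  # Monday morning
    {"user": "QSECOFR", "event": "LOGON", "timestamp": "2016-12-04T03:02:17"},  # Sunday, 3:02am
]

for e in events:
    ts = datetime.fromisoformat(e["timestamp"])
    if ts.hour not in BUSINESS_HOURS or ts.weekday() not in WORKDAYS:
        print(f"ALERT: off-hours logon by {e['user']} at {ts}")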

Alerting and Resolution Management
When a problem is discovered we need to know about it as soon as possible. A good active monitoring solution will inform us through a variety of alerting channels. Emails, text, dashboards and other mechanisms can be deployed to bring attention to the problem - and we need to be able to track the resolution of the event! We are all processing too much information and it is easy to forget or misplace a problem. 

Forensic Tools
Forensic tools are critical to an active monitoring solution as they enable the rapid analysis of an attacker’s footprints in our system. The key tool is an effective and easy-to-use query application. Log data can include millions of events and be impossible to inspect without a good query tool. The ability to save queries and use them at a later time should also be a core feature of your forensic toolset.



Topics: System Logging, IBM i

 

 
