
Townsend Security Data Privacy Blog


Customer Information: IBM i Security Products Acquired by Syncsort

Posted by Patrick Townsend on May 21, 2018 1:11:00 PM

We are pleased to announce that Syncsort has acquired Townsend Security's IBM i security offerings. Solutions in the sale include Townsend Security's encryption, tokenization, authentication, FTP, and SIEM integration products. Syncsort is the global leader in Big Iron to Big Data software. The acquisition helps Syncsort deliver on its ongoing commitment to strengthening the security of the IBM i platform and to businesses that rely on its performance and resilience to drive the business forward.

Townsend Security will continue in business focusing on Alliance Key Manager, our FIPS 140-2 compliant encryption key management solution that is in use by over 3,000 customers worldwide. Alliance Key Manager provides encryption key management and encryption services to a wide set of relational database solutions, big data solutions, and blockchain platforms. We are pleased that we will have an on-going partnership with Syncsort around providing the Townsend Security Alliance Key Manager solution to Syncsort customers.

During the transition, Townsend Security will work closely with Syncsort to make sure that your IBM i product needs are met. As part of the acquisition, we will be sharing your account information with Syncsort and anticipate a full handover within 6 months.

To learn more about Syncsort’s IBM i products and content offers, you can sign up here to be added to their mailing list.

We thank you for your business and trust you will continue to experience the same high level of customer satisfaction and support with Syncsort.

Patrick Townsend

Topics: Customer Information

SQL Server Always Encrypted vs Transparent Data Encryption (TDE)

Posted by Patrick Townsend on Apr 30, 2018 10:42:39 AM

Microsoft SQL Server customers ask us whether they should use Always Encrypted or Transparent Data Encryption (TDE) to protect sensitive data. It is a fair question, especially given concerns about the GDPR, and it applies to many other compliance regulations and data protection needs as well. Let’s explore these technologies in more detail and I think the answer will emerge.

Always Encrypted

Always Encrypted is a relatively new, client-side implementation of encryption for SQL Server. That is, data is encrypted on the source system before you insert it into a SQL Server database. Implemented as a Windows driver, Always Encrypted intercepts your SQL statement before it leaves your client-side system (PC, web server, etc.), determines if there are any fields in the SQL statement that need to be encrypted, establishes or retrieves an encryption key, encrypts the field, and sends the modified SQL statement to SQL Server for processing.
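As a brief illustration, an Always Encrypted column is declared in the table definition itself. This is a hedged sketch, not from our documentation: the table and the column encryption key name (MyCEK) are hypothetical, and the key would already have been defined in the database:

```sql
-- Hypothetical table with a column protected by Always Encrypted.
-- MyCEK is an assumed, previously created column encryption key.
CREATE TABLE dbo.Customers (
    CustomerID INT IDENTITY PRIMARY KEY,
    SSN CHAR(11) COLLATE Latin1_General_BIN2
        ENCRYPTED WITH (
            COLUMN_ENCRYPTION_KEY = MyCEK,
            ENCRYPTION_TYPE = DETERMINISTIC,
            ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256'
        )
);
```

The client application then opts in by adding `Column Encryption Setting=Enabled` to its connection string, and the driver handles encryption and decryption transparently.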

One advantage of this approach is that the data is encrypted as it travels over the internal or external network, and as it resides in the database. A retrieval of the encrypted data reverses this process ensuring that the data is protected in transit.

One significant limitation of Always Encrypted is that it can only be applied to a small subset of SQL operations. Many SQL operations are complex and cannot be processed by Always Encrypted.

Transparent Data Encryption and Cell Level Encryption

SQL Server Transparent Data Encryption (TDE) and Cell Level Encryption (CLE) are server-side facilities that encrypt either the entire SQL Server database at rest (TDE) or selected columns (CLE). The client-side application is completely unaware of the implementation of TDE or CLE, and no software is installed on the client-side system. All of the encryption tasks are performed by the SQL Server database itself. It is easy to implement and performs very well for most SQL Server customers.
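To give a sense of how simple the server-side approach is, here is a hedged T-SQL sketch of enabling TDE with a locally stored certificate (the database name, certificate name, and password are placeholders; a production deployment would protect the key with an EKM key manager instead):

```sql
-- Minimal sketch: enable TDE on a hypothetical database "SalesDB".
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate';

USE SalesDB;
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TDECert;

ALTER DATABASE SalesDB SET ENCRYPTION ON;
```

Note that the client application requires no changes at all; the database encrypts and decrypts pages as they move to and from disk.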

TDE and CLE are a part of the Microsoft SQL Server Extensible Key Management (EKM) strategy which has been a part of SQL Server since the 2008 edition. It is a mature technology and many organizations deploy TDE or CLE to protect their sensitive information.

Encryption Key Management

An encryption strategy is only as good as the encryption key management strategy. Creating, protecting, and using encryption keys is the hardest part of encryption, and for the sake of your overall security, it is important to get key management right.

With Transparent Data Encryption the key management strategy is well defined through the Microsoft Extensible Key Management (EKM) Provider interface. Key management systems such as our Alliance Key Manager for SQL Server provide software written to the EKM interface specification and SQL Server customers get full encryption key management integrated through the EKM Provider interface.
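For the curious, attaching a key manager through EKM is done with standard T-SQL. This is a hedged sketch only: the provider name and DLL path are hypothetical, and the actual file and configuration steps come from the key management vendor:

```sql
-- Enable the EKM feature, then register the vendor-supplied provider DLL.
sp_configure 'show advanced options', 1;
RECONFIGURE;
sp_configure 'EKM provider enabled', 1;
RECONFIGURE;

-- Provider name and path are placeholders for the vendor's EKM library.
CREATE CRYPTOGRAPHIC PROVIDER AKM_EKM_Provider
    FROM FILE = 'C:\Program Files\KeyVendor\ekm_provider.dll';
```

Once the provider is registered, keys created through it are managed by the external key manager rather than stored locally in SQL Server.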

Always Encrypted does not provide a formal interface for key management. Instead, third party vendors must provide Always Encrypted drivers that implement an interface to a key management system. This means that the interface to the key management system is proprietary. Additionally, since the Always Encrypted implementation is a client-side deployment, each client-side application will have to have access to the key manager in order to properly protect and use encryption keys. This can be a challenge in distributed network topologies.

The bottom line: you can achieve proper key management with either approach, but expect to deal with more complexity with Always Encrypted when you have distributed clients.

When to Use Always Encrypted

Because Always Encrypted functions by modifying the SQL operation before it interfaces with the SQL Server database, and because many complex SQL operations will not work with Always Encrypted, I would only recommend using Always Encrypted when the application architecture is very simple. For example, you might want to use Always Encrypted to send data from an internal SQL Server database to a web-hosted SQL Server database and application. The data will be protected in transit and encrypted in the database. As long as your web application makes simple SQL queries to the database this approach can work well.

When to use TDE or CLE

Transparent Data Encryption and Cell Level Encryption are implemented directly in the SQL Server database itself. This means that complex SQL operations are fully supported and you will not run into limitations imposed by your encryption strategy in the future. The SQL Server implementation of TDE and CLE also supports a standardized interface for encryption key management. This means you will have choices in key management vendors and can readily find a solution. TDE also has good performance metrics for the large majority of SQL Server customers, and is exceptionally easy to implement.


For the reasons outlined above I believe that Transparent Data Encryption or Cell Level Encryption implemented within SQL Server will almost always be the better choice for most SQL Server customers. You will not be locked into restrictions on the use of SQL operations, key management is well defined, and the encryption performance metrics are impressive.

I hope this overview of the two SQL Server encryption approaches has been helpful. Please feel free to reach out with any questions you have.



Topics: SQL Server, Transparent Data Encryption (TDE)

GDPR - Do I have to Use Encryption?

Posted by Patrick Townsend on Apr 24, 2018 8:44:17 AM

As the date for the formal implementation of the EU General Data Protection Regulation draws near, many of our customers are struggling with the question of whether or not they have to encrypt sensitive data. Because the fines for violating the GDPR can be quite high, this question is taking on a growing urgency. So, let’s take a look at this question in more detail by looking at the actual GDPR source documents.

The most relevant part of the GDPR regulation related to encryption is Article 32 - “Security of Processing”. The actual text of the article is very readable and you can find a link in the Resources section below. Here is an extract from Article 32 (emphasis added):

“Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate:

  1. the pseudonymisation and encryption of personal data;
  2. the ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services;
  3. the ability to restore the availability and access to personal data in a timely manner in the event of a physical or technical incident;
  4. a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring the security of the processing.”

At first glance it looks like we don’t really have to encrypt the sensitive data, because we get to take into account the costs of the implementation and the nature, scope, context and purpose for processing, along with some other potentially mitigating factors. If you read no further you might draw the conclusion that encryption is a recommendation, but not a requirement. Question answered, right?

Not so fast. Let’s dig deeper. The next point in Article 32 shines a brighter light on this question:

“2. In assessing the appropriate level of security account shall be taken in particular of the risks that are presented by processing, in particular from accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to personal data transmitted, stored or otherwise processed.”

In effect the GDPR is saying that your security controls must account for the risk of accidental, unlawful, or unauthorized disclosure or loss of personal data. That is a very broad category of potential violations of the protection of an individual’s data. Have you ever lost a backup cartridge? Do you really think your systems are secure enough to prevent access by sophisticated cyber criminals?

While on first look it seems that we have some leeway related to the deployment of encryption, GDPR quickly raises the bar on this question. Given the current state of security of data processing systems, no security professional should be absolutely comfortable with the security of their systems.

If you are still thinking you can avoid encrypting sensitive data, be sure to take a read of Recital 78, “Appropriate technical and organisational measures”.

It should be clear by now that if you decide NOT to encrypt sensitive data you should definitely document all of the reasons it is not feasible or practical to do so, and all of the measures you are actually taking to protect that data. Put this in writing and get senior management sign-off on your conclusions.

But there is more.

If you are wondering how serious GDPR is about encryption, be sure to read Recital 83 “Security of processing”. Here is an extract with emphasis added:

“In order to maintain security and to prevent processing in infringement of this Regulation, the controller or processor should evaluate the risks inherent in the processing and implement measures to mitigate those risks, such as encryption. Those measures should ensure an appropriate level of security, including confidentiality, taking into account the state of the art and the costs of implementation in relation to the risks and the nature of the personal data to be protected. In assessing data security risk, consideration should be given to the risks that are presented by personal data processing, such as accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed which may in particular lead to physical, material or non-material damage.”

If you are getting the notion that the authors of the GDPR really want you to encrypt sensitive data, you would be right.

Where else does encryption come into play?

There are safe-harbors in GDPR around data breach notification IF you are encrypting sensitive data. The avoidance of notification is not absolute, but here is one relevant section of Article 34, “Communication of a personal data breach to the data subject” (emphasis added):

The communication to the data subject referred to in paragraph 1 shall not be required if any of the following conditions are met:

  1. the controller has implemented appropriate technical and organisational protection measures, and those measures were applied to the personal data affected by the personal data breach, in particular those that render the personal data unintelligible to any person who is not authorised to access it, such as encryption;

If the sensitive data of a data subject is lost and not encrypted, it will be difficult to argue that the information is inaccessible. The loss of unencrypted data will certainly require notification to the supervisory authority and the data subject.

There is one more aspect to the discussion of encryption and that relates to the management of encryption keys. Your encryption strategy is only as good as your ability to protect your encryption keys. This is reflected in Recital 85 “Notification obligation of breaches to the supervisory authority” (emphasis added):

“A personal data breach may, if not addressed in an appropriate and timely manner, result in physical, material or non-material damage to natural persons such as loss of control over their personal data or limitation of their rights, discrimination, identity theft or fraud, financial loss, unauthorised reversal of pseudonymisation, damage to reputation, loss of confidentiality of personal data protected by professional secrecy or any other significant economic or social disadvantage to the natural person concerned. Therefore, as soon as the controller becomes aware that a personal data breach has occurred, the controller should notify the personal data breach to the supervisory authority without undue delay and, where feasible, not later than 72 hours after having become aware of it, unless the controller is able to demonstrate, in accordance with the accountability principle, that the personal data breach is unlikely to result in a risk to the rights and freedoms of natural persons. Where such notification cannot be achieved within 72 hours, the reasons for the delay should accompany the notification and information may be provided in phases without undue further delay.”

If you are not properly protecting the encryption key used for encryption, it must be assumed that the encryption can be reversed. Don’t use weak encryption keys such as passwords, don’t store encryption keys in files or in application code. Instead, use a professional key management solution to protect the keys.

Returning to our original question about the need for encryption of sensitive data, I hope you have arrived at Yes as the most responsible answer. The loss of unencrypted sensitive data will definitely trigger the need for data breach notification. And the improper protection of encryption keys will also trigger the need for breach notification. You are at more risk of financial penalties if you are not properly protecting that sensitive information with encryption.

The GDPR is complex and some parts are subject to interpretation. But if you control or process sensitive data you should not underestimate the serious intent of the GDPR to enforce protections for individuals. GDPR is revolutionary and disruptive - it is dangerous to ignore it.


Resources

The General Data Protection Regulation (GDPR)
The GDPR Recitals
GDPR Article 32 “Security of Processing"
Recital 78, “Appropriate technical and organisational measures”
Recital 83, “Security of processing”
GDPR Article 34, “Communication of a personal data breach to the data subject”
Recital 85 “Notification obligation of breaches to the supervisory authority”


Topics: EU GDPR, Encryption

Migrating from Oracle to MongoDB with Data Security

Posted by Patrick Townsend on Mar 20, 2018 2:38:51 PM

With big cost savings in mind, many Oracle Database customers are migrating selected applications to MongoDB. Let’s take a look at some of the factors that are driving this migration, and how MongoDB helps protect and secure data after the migration.

First, how do customers protect data that is stored in an Oracle Database?

Oracle Database supports Transparent Data Encryption (TDE) through a special interface implemented in Oracle Advanced Security. Once you’ve made the costly upgrade to Oracle Advanced Security, you can configure and activate encryption of the Oracle Database through the Advanced Security Wallet application. The encryption support can be either whole database encryption through Transparent Data Encryption or column level encryption. If you are using the Oracle Key Manager (OKM) solution, or a third-party key manager, the interface in Oracle Wallet is usually through a PKCS#11 interface library. This is simply a software library that uses the key manager to protect the local key used for database encryption.
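For reference, Oracle TDE is turned on with ordinary SQL once Advanced Security is licensed and the wallet is open. This is a hedged sketch; the table, column, tablespace, and datafile names are placeholders:

```sql
-- Column-level TDE on an existing table (hypothetical names).
ALTER TABLE customers MODIFY (ssn ENCRYPT USING 'AES256');

-- Or whole-tablespace encryption: everything stored in the
-- tablespace is encrypted at rest.
CREATE TABLESPACE secure_data
    DATAFILE '/u01/oradata/secure01.dbf' SIZE 100M
    ENCRYPTION USING 'AES256'
    DEFAULT STORAGE (ENCRYPT);
```

Both forms require the Advanced Security option, which is where the licensing cost discussed below comes in.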

The increase in cost to deploy Oracle Advanced Security can be substantial. It varies depending on your licensing agreement with Oracle or one of their distributors, but it affects the cost basis for each database user.

Faced with the increased cost of deploying Oracle Advanced Security, or wishing to achieve cost savings with existing Oracle Database implementations, Oracle customers are moving to MongoDB Enterprise which has a different security strategy.

So, how do MongoDB customers protect data in an equivalent fashion?

MongoDB Enterprise edition also supports Transparent Data Encryption of the database. But this is a native part of the Enterprise edition of MongoDB and does not involve additional license costs. Additionally, MongoDB Enterprise overall pricing is substantially lower than Oracle Database [Organizations are saving 70%+ by switching from Oracle to MongoDB]. Without the need for increased costs to get the encryption option, MongoDB customers are able to achieve Transparent Data Encryption in an affordable way.

Additionally, encryption key management is based on the industry standard Key Management Interoperability Protocol (KMIP) as defined by the OASIS standards group. Most enterprise key management systems, including our Alliance Key Manager for MongoDB solution, support KMIP. This means that customers can easily deploy Transparent Data Encryption with an affordable key management solution that plugs into MongoDB through the KMIP interface. This will make security-conscious Oracle Database customers happy in achieving an equivalent level of security after the migration.
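To make the KMIP integration concrete, here is a hedged mongod configuration fragment for the encrypted storage engine. The key manager host name and certificate paths are placeholders; consult your key manager's documentation for the exact certificate setup:

```yaml
# mongod.conf fragment: encrypted storage engine with a KMIP key manager.
# serverName and the certificate paths below are placeholders.
security:
  enableEncryption: true
  encryptionCipherMode: AES256-CBC
  kmip:
    serverName: keymanager.example.com
    port: 5696
    serverCAFile: /etc/mongodb/kmip-ca.pem
    clientCertificateFile: /etc/mongodb/kmip-client.pem
```

Because KMIP is an open standard, the same configuration pattern works with any conforming key manager; only the host name and certificates change.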

MongoDB has been doing a lot lately to woo Oracle Database customers to its platform. While MongoDB is a NoSQL big data database, it supports SQL-like interfaces to the database. This means that migrations from Oracle to MongoDB are pretty straight-forward [A Step-by-Step Guide on How to Migrate from RDBMS to MongoDB].

MongoDB also just announced a key feature that will make Oracle Database customers happy. That announcement is that MongoDB version 4 will fully support Atomicity, Consistency, Isolation and Durability (ACID) as a core feature of MongoDB. This is database geek-speak that means that MongoDB database applications will work with the same reliability as big commercial SQL databases - something that Oracle Database customers expect. I think this will help accelerate the migration from Oracle Database to MongoDB Enterprise database.

Oracle Database customers typically run a wide range of packaged and custom-built applications. I believe that early migrations from Oracle to MongoDB mostly involve those custom-built applications that are relatively simple in their architecture. Migrating from an Oracle ERP or CRM application might involve a great deal of planning and re-engineering. But I think almost every Oracle customer has some custom-built applications that are easy targets for a migration to MongoDB.

The cost savings combined with the ease of migration, the equivalent level of database encryption and key management security, and the new support for ACID database reliability in MongoDB version 4 is a winning combination for MongoDB as a company. I suspect that migrations from Oracle to MongoDB will pick up dramatically this year as MongoDB version 4 rolls out.

Here at Townsend Security we’ve done a lot to align with the MongoDB Enterprise strategy. We’ve formally certified our Alliance Key Manager solution with the MongoDB security team, we support both Intel and Power platforms for our key management interface, we deploy in cloud, VMware and on-premise environments, and we’ve aligned our pricing and licensing strategy with the MongoDB strategy. Entry level MongoDB customers can deploy compliant (PCI DSS, FIPS 140-2) key management in an affordable manner, and key management licensing follows the MongoDB model. Even very large MongoDB Enterprise customers will be happy with our key management licensing, scalability, and pricing strategy.

I am impressed with MongoDB’s goal of bringing enterprise level database technology to a wide variety of small, medium, and large Enterprise customers. It’s just the right time for a new, disruptive approach to databases.

If you'd like to learn more about how we are helping secure MongoDB Enterprise with encryption and key management, visit our Alliance Key Manager solution for MongoDB page.



Topics: MongoDB

High Availability Strategies for MongoDB Encryption Key Management

Posted by Patrick Townsend on Mar 2, 2018 10:27:59 AM

The MongoDB database is designed to be resilient and gives you several options for high availability and business continuity. But what about encryption key management - how do we implement high availability when we’ve deployed a key management system for MongoDB encrypted databases? Here at Townsend Security we encounter this question with MongoDB users on a regular basis. Let me share some of the approaches that we recommend.

We can take a GOOD, BETTER, and BEST view of key management high availability. Not every MongoDB database implementation needs extreme high availability, and there are costs and benefits for different approaches. So let’s run down some of the common options:


Good: Backup and Restore

Not everyone uses the MongoDB database for mission critical applications, and hot failover is not needed. But, of course, in the event of a disaster that takes down your data center, network, or servers, you want to be able to recover in a reasonable amount of time. In this case, you can rely on the backup and restore capabilities of Alliance Key Manager for MongoDB. Here is a diagrammatic example of a simple case of a primary and secondary MongoDB implementation that share a single key manager:


In this case we deploy a single key manager to serve both the primary and secondary nodes of a MongoDB implementation. After initializing the MongoDB database for encryption, we perform a key manager backup to archive the master encryption key for the primary and secondary nodes of the database.

Recovery of the MongoDB database may involve migration from the secondary node to the primary node when it is back online, or restoration from a backup image, and a restore operation for the key manager. Alliance Key Manager makes it easy to backup the encryption key database and configuration, and to restore from backup when needed. This is the simplest case of key management recovery when hot failover is not needed.

Remember to follow security best practices when backing up Alliance Key Manager. You will want to save the secret keys separately from the data encryption keys, and ensure separation of duties. See the Alliance Key Manager documentation for guidance on backup and restore operations.


Better: Mirrored Key Managers

For many MongoDB customers the applications built on the MongoDB database represent core, mission critical applications that must be available at all times. These customers need a high availability strategy that guarantees very little loss of availability and rapid recovery. While there are different failover strategies for MongoDB customers, the normal approach would be to failover to a secondary MongoDB node at a geographically independent data center. The primary and secondary nodes would use separate key management systems which would be synchronized. A diagram might look something like this:


The primary MongoDB node has a key manager deployed in its data center or cloud location, and the secondary MongoDB node has a different key manager deployed in its data center or cloud location. The two key managers are mirroring encryption keys in real time in an active-active configuration. Both the MongoDB data and the Alliance Key Manager instances are fully redundant in real time.


Best: Redundant Key Managers with Load Balancing

Enterprise customers who build mission critical applications on MongoDB databases and who must have full business continuity and high availability failover can achieve this with MongoDB replication and redundant Alliance Key Manager servers. A primary MongoDB node can associate two redundant key servers, and a secondary replicating MongoDB node can associate two different redundant key servers. Since MongoDB configuration only allows for the definition of a single key server, we can use a load balancer to implement the redundant key management pair. A diagram of this configuration would look like this:


With a load balancer placed between the MongoDB database and the two key managers you can achieve hot failover in the event of a lost connection to the first key server, without loss of access to the database. When the connection to the main key server is restored, the load balancer will bring it back online. The two Alliance Key Manager servers automatically mirror encryption keys to each other in an active-active configuration.

In the event of a full loss of the primary MongoDB database, failover to the secondary MongoDB database will be handled by the MongoDB Arbiter. The fully replicated data will be available, and the secondary database will be protected by a pair of Alliance Key Manager servers in the same way as the primary MongoDB database.

Note that there can be multiple secondary MongoDB nodes and each can implement a similar key management failover strategy. With the above strategy MongoDB database customers can achieve a very high level of business continuity and high availability failover.

Additional Considerations

MongoDB database deployments vary a great deal in their overall architecture and implementation. This is a testament to the flexibility of the database and its ability to meet a wide variety of customer use case scenarios. Alliance Key Manager for MongoDB can help improve security and recoverability in any MongoDB deployment.

Alliance Key Manager is available as a Hardware Security Module (HSM), as a VMware software appliance (virtual machine), and as cloud instances. The interface to all of the key managers works in exactly the same way. This means you can create hybrid deployments of MongoDB and Alliance Key Manager across clouds, and between cloud and on-premise deployments.

At the time this blog was written (March 2018) the MongoDB Atlas cloud platform did not support independent third party key management solutions through the KMIP interface. That is likely to change in the future. For Enterprise customers who must achieve exclusive custody of encryption keys, you can deploy MongoDB in a normal cloud instance and use the encryption and key management capabilities of MongoDB with Alliance Key Manager. You can then migrate to the Atlas service when it supports the KMIP interface for key management.

About Alliance Key Manager

Alliance Key Manager for MongoDB is certified by the MongoDB security team, and supports the MongoDB Enterprise pricing model. Regardless of the size of your MongoDB implementation you will find an affordable and easy-to-deploy Alliance Key Manager for MongoDB solution.



Topics: MongoDB Encryption

IBM i FieldPROC Encryption, IBM Query, and Encrypted Indexes

Posted by Patrick Townsend on Jan 29, 2018 8:31:08 AM

The IBM i DB2 database team implemented column level encryption through Field Procedures (FieldProc) in release 7.1 of the IBM i operating system. It was a great step forward for those IBM i customers who need to encrypt sensitive data to meet compliance regulations and to improve overall security. With release 7.1 it became possible to implement DB2 encryption without modifying user applications.

Prior to this enhancement to DB2 in release 7.1, implementing encryption could be a laborious process of modifying applications and performing extensive regression testing. This approach did not work well with some types of fields (date, time, etc.) and many IBM and third-party query utilities just did not work correctly. So the DB2 enhancement for Field Procedures was a great step forward.
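For readers unfamiliar with the mechanism, a field procedure is an ILE program that DB2 calls automatically to encode (encrypt) and decode (decrypt) a column's values. Attaching one is a single SQL statement; the library, table, column, and program names below are hypothetical:

```sql
-- Attach a field procedure (an ILE exit program that encrypts and
-- decrypts the column's values) to an existing column.
-- MYLIB, CUSTOMERS, CARD_NUMBER, and FLDPROC1 are placeholder names.
ALTER TABLE MYLIB.CUSTOMERS
    ALTER COLUMN CARD_NUMBER
    SET FIELDPROC MYLIB.FLDPROC1;
```

From that point on, applications read and write the column normally while DB2 invokes the field procedure behind the scenes, which is why no application changes are required.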

While FieldProc worked well with native SQL applications in release 7.1, there were limitations for older RPG and COBOL applications, and many IBM query utilities did not work correctly with encrypted indexes. Many IBM i customers use IBM and third-party query programs for rapid development of reports and displays of data. Some customers that I’ve talked to have thousands of queries in their application mix, so limitations of IBM query with FieldProc represented an insurmountable challenge for many. When FieldProc was used to encrypt an index or key field, queries just would not work correctly and data was missing or out of order in reports and displays.

But there is some good news!

Starting with the 7.2 release of the IBM i operating system, many of the IBM query applications were updated to work with the native DB2 SQL Query Engine (SQE) by default. The SQL Query Engine has never had a problem with encrypted indexes. This means that the IBM query applications now work seamlessly with data that is encrypted with FieldProc in DB2. You can fully deploy column level encryption across multiple index columns with FieldProc, and your queries will work fine.

Many IBM i customers experimented with FieldProc in the first release in version 7.1 of the operating system and decided to take a pass. If you had that experience it is time to take another look at DB2 FieldProc encryption. The current version of the IBM i operating system is 7.3 and most IBM i customers have upgraded to this release. You now have good support for IBM queries, the SQL Query Engine, and DB2 FieldProc encryption.

While some challenges remain for older OPM and ILE RPG applications, we’ve been able to help a number of customers meet these challenges.

Encryption of data is a core part of a defense-in-depth strategy. We have to do a lot of things to protect our IBM i data, and one of those things is to encrypt the data at rest with industry standard encryption algorithms. DB2 Field Procedures provides the path to achieving this.

To read more about IBM i support for SQL Query Engine in query applications such as RUNQRY, OPNQRYF, and others, see this link.

Our Alliance AES/400 Encryption solution provides full support for DB2 Field Procedures, is easy to deploy, and affordable for every IBM i customer. 

For industry standard encryption key management you can deploy our Alliance Key Manager solution which is seamlessly integrated with DB2 Field Procedure encryption.



Topics: Encryption, IBM i, FIELDPROC

GDPR, Right of Erasure (Right to be Forgotten), and Encryption Key Management

Posted by Patrick Townsend on Jan 22, 2018 11:11:28 AM

The European General Data Protection Regulation (GDPR) is a radical and transforming event in the information technology space. Due to go into full effect on May 25, 2018, it will require major changes to IT systems and the way organizations relate to their customers, employees, and external partners. It is hard to overstate the impact of the regulation. Organizations of all sizes and types, and cloud service providers small and large, must adjust to the notion that people now fully own information about themselves - and companies outside of the EU zone are impacted, too.

Article 17 of the GDPR focuses on the “Right of erasure”, also known as the “Right to be forgotten”. Here is a link to that section.

Let’s talk about how we can use encryption and key management to help meet the requirements of the legislation. Deploying encryption will also help meet the privacy requirements of GDPR, so the same technology can be used to implement the Right of Erasure.

First, let’s look at the technology landscape related to encryption:

Encryption is one of the most well understood mechanisms for data privacy. There are well-established, mature standards for encryption and the related key management technologies. Most companies will use encryption to meet GDPR privacy requirements, and will be deploying encryption key management to protect the keys. There are mature encryption technology solutions available on all major enterprise operating systems and on all major cloud platforms. Protecting encryption keys is also well understood. Many organizations have already deployed encryption in some parts of their organizations, and GDPR will speed this process and extend protections across all parts of the data landscape.

The hardest part of getting encryption right is creating, protecting, and deploying encryption keys - and there are a lot of ways to get key management wrong:

  • Storing the unprotected encryption key with the protected data
  • Using weak protection methods to secure encryption keys
  • Storing the encryption key directly in application code
  • Using a weak encryption key - a password is an example of a weak key
  • Not using strong, industry standard methods of generating an encryption key
  • Not providing separation of duties and dual control around key management

There are lots of ways to get encryption key management wrong - and bad key management practices will result in GDPR compliance failures.
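
To make the first two pitfalls on that list concrete, here is a short sketch (standard-library Python only) contrasting a weak, password-style key with a strong, randomly generated one. In a real deployment the strong key would come from a key manager rather than being generated locally:

```python
import secrets

# Weak: a password used directly as a key has tiny entropy and is guessable.
weak_key = b"Password123!"

# Strong: 256 bits from the operating system's cryptographically secure
# random number generator - the standard way to generate an AES-256 key.
strong_key = secrets.token_bytes(32)

print(len(weak_key), len(strong_key))
```

Never substitute the `random` module (or any general-purpose RNG) for `secrets` when generating keys; only a cryptographically secure source is acceptable.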

Fortunately, it is fairly easy to deploy good encryption key management that is affordable, easy to install and configure, and easy to integrate with your encryption strategy. A number of professional key management solutions are available to serve every enterprise operating environment. We have one (Alliance Key Manager), and others are available.

Now that we have a good encryption and key management strategy in place, let’s use it to meet the GDPR Right to Erasure.

Under GDPR Article 17, the need to erase personal information can be triggered by a number of events:

  • A Data Subject (usually a person) can request erasure of personal information
  • The personal information is no longer relevant from a business perspective
  • A Data Subject withdraws consent and there is no overriding need or requirement to retain the data
  • A Data Subject objects to the processing of their information
  • Personal data has been unlawfully obtained or processed

That covers a lot of ground! It is not as simple as just responding to a request for erasure; we also have to be aware of our actual need for the information. And erasure triggers some secondary requirements:

  • The Data Controller must attempt to remove data that has been made publicly available
  • The Data Controller must inform third party Data Processors of the need to erase data

We have a lot of responsibilities under GDPR Article 17. How can we use encryption and key management to meet this requirement?

A key management approach:

Imagine that you assign a unique encryption key to each Data Subject (employee, customer, and so forth) and that you encrypt that person’s personal data in your databases with that unique and specific key. The time comes when you must meet your obligations under the Right of Erasure. Rather than go through every database table and storage server to delete the data, you could just delete the encryption key. Assuming you have strong encryption keys and industry standard key deletion processes, the deletion of the key is an effective way to zero the protected data without actually modifying the database. Data that is encrypted is unrecoverable if the key is no longer available.
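
The per-subject key scheme can be sketched in a few lines. This is illustrative only - the keystream_xor function below is a toy stand-in for a real cipher (a production system would use AES-256 with a certified key manager) - but the key-deletion logic is the point:

```python
import hashlib
import secrets

def keystream_xor(data: bytes, key: bytes) -> bytes:
    """Toy XOR stream cipher built from SHA-256.
    A stand-in for AES, for illustration only."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

# One key per Data Subject, held in a (stand-in) key manager.
key_manager = {"subject-42": secrets.token_bytes(32)}

# Encrypt the subject's personal data; the ciphertext can then live
# in databases and backups.
record = keystream_xor(b"Jane Doe, jane@example.com", key_manager["subject-42"])

# Right of Erasure: deleting (zeroizing) the key renders every copy of
# the ciphertext - including off-site backups - unrecoverable.
del key_manager["subject-42"]
```

The subject identifier, field value, and dictionary-based key manager here are all hypothetical; the design point is that erasure touches one small key record instead of every table and backup that holds the data.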

There is one more added benefit to this approach - it effectively erases all of the data on your backups! Managing compliance with GDPR is especially difficult when it comes to off site backups of sensitive data. The ability to effectively erase data by erasing the encryption key without having to pull those backups out of storage is a huge cost and administrative saving!

The strategy described above is only defensible if you are encrypting the Data Subject’s information, if you are assigning them a unique encryption key, and if you are using an encryption key management solution that provably meets industry standards for key zeroization. Our key management solution does, and you can get more information here.

We’ve touched just one aspect of GDPR. We will be talking more about GDPR in the days ahead.



Topics: Compliance, EU GDPR

IBM i Customers Lag the Industry When It Comes to Encryption

Posted by Patrick Townsend on Dec 4, 2017 11:29:40 AM

This year we undertook a survey of organizations to determine how well they are doing in deploying encryption to protect sensitive digital assets. We were curious to see the level of progress organizations were making on this core part of a defense in depth strategy. Because we have information about both IBM i and non-IBM i users, we also wanted to see if there were differences based on the platform they used for their critical applications.

We received responses from over 300 technology users. We felt it was a large enough response to allow us to make some generalizations.

The results were shocking.

Approximately 80 percent of our Windows and Linux respondents had taken some concrete steps to implement encryption to protect sensitive data. In most cases they still had a ways to go to fully protect sensitive data, but that was a lot more progress than I would have imagined. We did not try to distinguish between the particular database in use to store data, but we know that it spanned commercial databases (SQL Server, Oracle, etc.) and open source databases such as MySQL, MongoDB and others.

More Windows and Linux users had made progress with encryption than I would have guessed (sorry).

The real shocker was how few IBM i organizations had taken steps to deploy encryption. Only about 6 percent of IBM i users had made any progress in deploying encryption as part of their security defense in depth. And many of these IBM i users said that they had no plans at all to use encryption for security.

My surprise arises from the fact that I think of IBM i users as generally being more diligent and deliberate around sound system management and security practices. But clearly this is not the case. So what could explain this lapse on the part of IBM i users?

We have some comment data from the survey that points to a general view that IBM i users think their systems are more secure than other platforms. This view is probably a result of IBM’s early investment in security on the platform, and historical messaging about this. Of course we know now that the IBM i platform is subject to data breaches in the same ways that Windows and Linux are, but I believe that this outdated view of the security of the platform now works against IBM i users and leads directly to avoidable data breaches.

Unfortunately, I think there is another source for this lag in security by IBM i customers. There are still too many IBM employees and IBM consultants carrying the message to customers that their systems are more secure than they actually are. I even read a recent comment by a senior IBM engineer that IBM customers should not give much attention to encryption of data on their systems. They should, instead, put more attention on access controls and system security.

The myth that the IBM i platform is so secure that you don’t need to worry about encryption is still an unfortunate reality in this community. We know that IBM i users have lost data in breaches. We know that the IBM i server can be breached through weaknesses and vulnerabilities in adjoining networked PCs and servers using classic hacking techniques. A poor implementation of a defense-in-depth strategy leaves IBM i customers exposed. Because the IBM i server often hosts many back-office applications with rich sources of sensitive data, this is an especially egregious lapse of security.

As a community, IBM i users must do better in deploying a proper defense in depth for their sensitive data. And hopefully thought-leaders inside IBM and outside IBM will recognize the danger of overstating the platform’s native security.




MongoDB San Francisco local conference a big success

Posted by Patrick Townsend on Oct 21, 2017 12:50:03 PM

I just got back from the first MongoDB regional conference in San Francisco and wanted to share a few impressions. I will be blogging more about MongoDB, but here are some first impressions from the conference:

We’ve had support for MongoDB encryption key management for some months now, and the conference in San Francisco last week was a real delight. This is an amazing community of developers and users who are innovating and creating new applications based on MongoDB at a rapid pace. I was happy to give the security session on encryption key management for MongoDB and it was a great success. You can find a replay here.

I spent time talking to some of the hundreds of attendees and learned a lot about how they use MongoDB. Here are just a few take-aways:

MongoDB is Easy to Use

Database development has the reputation of being difficult and requiring extensive expertise. But I met a lot of people who have developed MongoDB databases without much or any database administrative knowledge. For example, I talked to a young scientist (that was the title on her badge) who was not in an IT group at all. She had developed a MongoDB database to store and analyze genetics information from her genome sequencing project. I met a pair of IT administrators from a property management company who had developed three applications on MongoDB because their development team had a long backlog of projects. They were in the network administration group and did not have formal experience with database development, but they built applications in a short period of time. It’s truly amazing to see sophisticated applications being developed by users!

MongoDB is Replacing Traditional Relational Databases

Before last week I thought of MongoDB as a big data repository for documents and IoT data. It is that, but it is a lot more. Many of the professionals I talked to at the conference were using MongoDB the way you would use a normal relational database like SQL Server, Oracle, DB2 or MySQL. I had the definite sense that MongoDB has found a way to bridge the worlds of relational databases and unstructured big data repositories. MongoDB users told me that the APIs and toolsets let them do almost anything they wanted to do.

MongoDB Really Scales for Big Data Needs

Well, MongoDB really is a great repository for big data. I talked to a number of larger enterprises from the San Francisco Bay Area who are storing extremely large amounts of data in MongoDB databases. Medical, IoT sensor data, financial data, customer service data and other types of data from a variety of data sources are being collected in MongoDB for business operations support and business analytics. The scalability of MongoDB is truly impressive.

Sensitive Data is Going into Almost All MongoDB Databases!!!

The other thing I learned is that an awful lot of sensitive data is being stored in MongoDB. The database is very flexible, and so there tend to be a lot of data feeds into the database. And those database feeds can include sensitive data that the MongoDB developers do not know about. I heard stories about MongoDB developers being surprised to find credit card numbers, social security numbers, email addresses, medical data and a lot of personally identifiable information (PII) being stored in the database.

MongoDB developers are really aware of the risks of sensitive data in the database. And they were hungry for information on how to protect it. Fortunately MongoDB Enterprise makes it incredibly easy to implement encryption and key management. In my session I was able to show how you can enable encryption in MongoDB with just a few commands. Customers upgrading from the open source Community Edition of MongoDB get access to this encryption facility and it is a delight.
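
As a sketch of what "just a few commands" looks like: MongoDB Enterprise’s encrypted storage engine can be pointed at an external KMIP key manager from the mongod configuration file. The hostname and certificate paths below are placeholders; check the MongoDB Enterprise documentation for your version before relying on these option names:

```
# mongod.conf - encrypted storage engine with an external KMIP key manager
security:
  enableEncryption: true
  kmip:
    serverName: keymanager.example.com      # placeholder: your KMIP server
    port: 5696                              # standard KMIP port
    serverCAFile: /etc/mongodb/kmip-ca.pem
    clientCertificateFile: /etc/mongodb/kmip-client.pem
```

Once the database is restarted with these settings, encryption and key retrieval happen transparently under the storage engine, with no application changes.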

MongoDB is Doing Encryption and Key Management Right

I’ve been impressed with the MongoDB implementation of encryption and key management from the start. First, MongoDB stepped up and implemented encryption right in the MongoDB database. There is no need for any third party file or folder encryption product to sit under the database. The encryption in MongoDB is based on industry standards (256-bit AES) and is tuned for performance. This is exactly what you want - your database vendor taking responsibility for encryption and owning the performance profile.

MongoDB also got the encryption key management part right. They based the key management interface on the open OASIS standard Key Management Interoperability Protocol (KMIP) in order to immediately support a broader community of key management vendors. That made it easier for us to certify our key management solution, Alliance Key Manager, for the MongoDB Enterprise platform. We are happy to support both Intel and POWER chip architectures for MongoDB deployments.

Lastly, just a personal note. I met a lot of MongoDB staff and managers at the conference. What a great bunch of people. They were energized and positive about what they are doing. Every company has its own character, and I found myself happy that we were engaged with this group of people.



Topics: MongoDB Encryption, MongoDB Encryption Key Management, MongoDB

HIPAA and Encryption - What You Need to Know

Posted by Patrick Townsend on Oct 10, 2017 8:46:26 AM

HIPAA regulations regarding data security for patients are hard for a layperson to understand, and are even difficult for administrators and technologists who work in the healthcare industry. This is especially true for smaller organizations, partly due to the complexity of the HIPAA law itself, and the HITECH regulations that followed it. Let’s try to clear up some of the misunderstanding around HIPAA and encryption, and clarify what you should do regarding data protection for patient data.

The first point of confusion about the protection of patient data is that encryption is strongly recommended, but it is not a mandate. It is what is termed an “addressable” requirement. The word “addressable” has a very specific meaning in the context of HIPAA. If implementing encryption of patient data is not feasible, a healthcare organization under HIPAA regulations can implement equivalent protections. So, if your software vendor or IT department thinks that encryption is not feasible, you have the option to implement other equivalent security controls to compensate. The reasons why you think it is not feasible must be documented in writing, and must be reasonable and valid.

Encryption is not a mandate under HIPAA law. And unless the law changes, it is probably not possible for HHS and its Office for Civil Rights (OCR) to make it mandatory.

But there is much more that you need to know. While HHS and OCR cannot mandate the encryption of patient data, they do have the ability to make it painful if you don’t. And that is exactly what they are doing. For example, if you claim that you can’t encrypt patient data, document your reasons, implement compensating controls, and THEN have a data breach, you are likely to be penalized for the lack of effectiveness of the compensating controls. Your data breach is clear evidence that your compensating controls were inadequate.

I like to call this a “backdoor encryption requirement”. That is, there is probably nothing you can do in the way of compensating controls that is equivalent to encryption. But you won’t discover that until you have a data breach.

Lacking the ability to mandate encryption, HHS and OCR have taken to the strategy of increasing the penalties for lost patient data. I’ve heard recently from many organizations in the healthcare segment of increasing concern about the potential fines related to a data breach. This is driving a new interest in encryption and the related requirement to protect encryption keys.

This last point is crucial when implementing encryption for HIPAA compliance - your encryption strategy is only as good as your encryption key management strategy. Encryption keys are the secret that has to be protected. If a cybercriminal obtains the encryption key, they can access the patient data. Storing encryption keys in a file, in application code, or on mountable drives or USB storage will certainly fail a best practices review. Use a professional, certified key management solution in your encryption deployment to protect patient data.

If you are going to do encryption of patient data, get it right the first time! Use good key management practices.



Topics: Encryption Key Management, HIPAA, AES Encryption


