Townsend Security Data Privacy Blog

The Top 10 Encryption Pitfalls

Posted by Luke Probasco on Mar 22, 2011 9:02:00 AM


As compliance regulations start mandating encryption and key management, we are seeing more and more companies stepping up their data security policies. One important thing to realize is that implementing encryption does not necessarily mean you are doing it correctly, or that you will meet regulations such as PCI DSS, HIPAA/HITECH, and state privacy laws.

We have compiled a list of the top ten encryption pitfalls that your enterprise needs to be aware of.


1) Encryption Key Management

Encryption requires a proper key management strategy. This means protecting and isolating encryption keys from the data that they protect. For most companies, this means using a proper key management solution across all of their servers and applications. Townsend Security offers Alliance Key Manager to help meet key management requirements and compliance regulations.
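
To make that concrete, here is a minimal sketch, in Python, of what key separation looks like from the application side. The host name, file paths, and wire protocol are hypothetical, invented for illustration; this is not the Alliance Key Manager client API. The point is simply that the key lives on a separate, mutually authenticated server and is fetched into memory only at the moment of use:

    import socket
    import ssl

    # Hypothetical key server address -- not a real product endpoint.
    KEY_SERVER = ("keys.example.internal", 6001)

    def fetch_key(key_name: str) -> bytes:
        """Retrieve a named 256-bit key over mutually authenticated TLS."""
        context = ssl.create_default_context(cafile="/etc/pki/key-server-ca.pem")
        # The client certificate identifies this application to the key server.
        context.load_cert_chain("/etc/pki/app-client.pem")
        with socket.create_connection(KEY_SERVER) as sock:
            with context.wrap_socket(sock, server_hostname=KEY_SERVER[0]) as tls:
                tls.sendall(key_name.encode() + b"\n")  # toy request protocol
                return tls.recv(32)  # key is held in memory, never written to disk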


2) Completeness and Compatibility

It’s not uncommon for encryption solutions to implement only a partial specification of AES encryption. There are nine encryption modes (five for business data) that can be used with AES encryption. A solution that supports only one mode, such as CBC, cannot decrypt data that was encrypted in another mode, such as ECB. This incompatibility makes transferring encrypted data from one server to another difficult or impossible. Townsend Security’s Alliance AES Encryption is NIST-certified on all five modes for business data.
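
You can see the incompatibility in a few lines of Python using the open source cryptography package (a sketch, not our product): data encrypted in CBC mode cannot be recovered by a product that only implements ECB, even with the correct key.

    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key, iv = os.urandom(32), os.urandom(16)
    block = b"4111111111111111"  # one 16-byte AES block of sample data

    enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    ciphertext = enc.update(block) + enc.finalize()

    # Same key, wrong mode: the output is garbage, not the original data.
    ecb = Cipher(algorithms.AES(key), modes.ECB()).decryptor()
    wrong = ecb.update(ciphertext) + ecb.finalize()

    cbc = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
    right = cbc.update(ciphertext) + cbc.finalize()

    assert right == block and wrong != block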


3) NIST Certification

As regulators refine the requirements for encryption and key management, certification of products to NIST standards becomes more important. The 2009 HITECH Act makes specific reference to the NIST standards for encryption and key management. Many vendors of encryption solutions ignore NIST certification, leaving their customers exposed to these evolving regulations.


4) Performance

The impact of encryption on servers and applications is often an unpleasant surprise as companies implement their data security plans. There are large differences in the performance of vendor solutions, and poor performance can delay or derail a data security effort.
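
Before committing to a solution, measure throughput on your own data volumes. Here is a rough benchmarking sketch in Python using the open source cryptography package; your platform, solution, and numbers will certainly differ:

    import os
    import time
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key, iv = os.urandom(32), os.urandom(16)
    data = os.urandom(16 * 1024 * 1024)  # 16 MB of block-aligned test data

    enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    start = time.perf_counter()
    ciphertext = enc.update(data) + enc.finalize()
    elapsed = time.perf_counter() - start

    print(f"AES-256-CBC: {len(data) / elapsed / 1e6:.1f} MB/s")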


5) Application Modifications

Implementing encryption at the database level often involves some application redesign and modification. This requires work by companies and their vendors, and that work is often unplanned and unbudgeted, causing financial and staffing problems. Look for a solution that keeps application modifications to a minimum.


6) Quality Assurance and UAT Testing

When applications and databases are modified to implement encryption, there is a need to re-certify them for accuracy, reliability and performance. Many companies find this effort larger than the effort to implement encryption.


7) Data Leakage to QA and Test Environments

Every company that maintains business applications must keep a set of data available to the developer and user acceptance teams so that changes can be adequately tested. Often the data used in these test environments contains sensitive information. Good practice requires proper protection of this information using encryption, masking, or tokenization.
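
Masking is usually the simplest of the three for test data. A minimal Python sketch; a real deployment would also preserve formats and referential integrity across tables:

    def mask_pan(pan: str) -> str:
        """Mask a card number for test data, keeping only the last four digits."""
        return "*" * (len(pan) - 4) + pan[-4:]

    print(mask_pan("4111111111111111"))  # prints ************1111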


8) System and Compliance Logging

A common question asked by auditors is “How do you know who decrypted a credit card number?” Unless your encryption solution has integrated compliance logging, you may not know who is viewing sensitive data in your database systems. Compliance logging is often overlooked by vendors of encryption systems, leaving companies perplexed in the event of a data loss. Townsend Security offers Alliance LogAgent for the IBM i, or Syslog-ng as either an application or an appliance.
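
The underlying idea is simple enough to sketch. The Python wrapper below (an illustration, not Alliance LogAgent) writes an audit record naming the user, the field, and the time before any plaintext is released, so the auditor's question has an answer:

    import getpass
    import logging
    from datetime import datetime, timezone

    logging.basicConfig(filename="decrypt_audit.log", level=logging.INFO)
    audit = logging.getLogger("compliance")

    def audited_decrypt(decrypt_fn, field_name: str, ciphertext: bytes) -> bytes:
        """Decrypt a field, logging who touched it and when."""
        audit.info("user=%s field=%s time=%s action=decrypt",
                   getpass.getuser(), field_name,
                   datetime.now(timezone.utc).isoformat())
        return decrypt_fn(ciphertext)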


9) Key Access Controls

Encryption and key management access controls are essential to an encryption strategy. Can you specify who has access to the HR encryption key for payroll processing? The ability to restrict the use of encryption to specific users and groups is an essential security control.
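
In practice this means the key manager checks a policy before it releases a key. A toy Python sketch, with hypothetical key and group names:

    # Hypothetical policy table: which groups may use which keys.
    KEY_POLICY = {
        "HR-PAYROLL-KEY": {"payroll-processors"},
        "CARDHOLDER-KEY": {"payments-app", "settlement-batch"},
    }

    def authorize_key_release(key_name: str, user_groups: set) -> bool:
        """Release a key only to a user in a group authorized for it."""
        return bool(KEY_POLICY.get(key_name, set()) & user_groups)

    assert authorize_key_release("HR-PAYROLL-KEY", {"payroll-processors"})
    assert not authorize_key_release("HR-PAYROLL-KEY", {"web-developers"})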


10) Virtual and Cloud Platforms

Encryption and key management in virtual and cloud environments pose special challenges. The PCI SSC virtualization group indicates that security concerns are much higher in these environments, and there is currently no standard for implementing key management in the cloud.

In conclusion, there are many factors involved when choosing the right encryption and key management solution for your enterprise.  Additionally, once chosen, it is also important to make sure that it is implemented correctly.  For more reading on encryption and PCI, we have written a white paper titled Encryption Key Management Requirements for PCI.


Topics: PCI DSS, Encryption Key Management, AES Encryption

Migrating to Alliance Key Manager with IBM i Native Encryption APIs

Posted by Patrick Townsend on Mar 7, 2011 11:10:00 AM

Now that the new version of the PCI Data Security Standard (PCI DSS version 2.0) is in effect, many IBM i (AS/400, iSeries) customers are getting dinged on their PCI compliance in the area of encryption key management. The renewed focus on "Dual Control" and "Separation of Duties" by QSA auditors is forcing many IBM i customers to move from homegrown key management to a better method of securing keys. This is even happening for IBM i customers who use IBM’s Master Key and key database facility. Why is this? There is just no way to properly implement effective security controls for the QSECOFR user, or for any user with All Object (*ALLOBJ) authority. Thus no "Dual Control" and no "Separation of Duties." And QSA auditors have figured this out.

Moving to good key management does not mean you have to completely change how you encrypt the data. And it doesn’t have to be a time consuming, laborious process. Many IBM i customers use the native IBM i encryption APIs to protect data. Let us show you how easy it is to implement our Alliance Key Manager solution in RPG code while maintaining your encryption approach.

When you use the native IBM i APIs you first create an encryption algorithm context, then a key context, and then you use these contexts on the call to the encryption or decryption API. If you are using the IBM Master Key facility and special key database, you pass additional parameters to the key context API. Before migrating to our Alliance Key Manager solution, your RPG code might look something like this:

      * Create a key context
     C                   eval      myKey = 'some binary value'
     C                   eval      keySize = 32
     C                   eval      keyFormat = '0'
     C                   eval      keyType = 22
     C                   eval      keyForm = '0'
     C                   callp     CrtKeyCtx( myKey      :keySize :keyFormat
     C                                       :keyType    :keyForm :*OMIT
     C                                       :*OMIT      :KEYctx  :QUSEC)
      *
      * Now we call Qc3EncryptData or QC3ENCDT to encrypt some data
      * and pass it the key context field <KEYctx>

After you implement the Alliance Key Manager solution and the IBM i API to retrieve the key, your application code would look like this:

      * Get the key from Alliance Key Manager
     C                   eval      AKMName = 'SomeKeyName'
     C                   eval      AKMInstance = ' '
     C                   eval      AKMSize = 256
     C                   eval      AKMFormat = 1
     C                   callp     GetKey( AKMName       :AKMInstance
     C                                       :AKMSize    :AKMFormat
     C                                       :AKMKey     :AKMUsed
     C                                       :Expires    :LastChange
     C                                       :Reply)
      *
      * Now we can use the field <AKMKey> on the create of the key context
      *
      * Create a key context
     C                   eval      keySize = 32
     C                   eval      keyFormat = '0'
     C                   eval      keyType = 22
     C                   eval      keyForm = '0'
     C                   callp     CrtKeyCtx( AKMKey      :keySize :keyFormat
     C                                       :keyType    :keyForm :*OMIT
     C                                       :*OMIT      :KEYctx  :QUSEC)
      *
      * Now we call Qc3EncryptData or QC3ENCDT to encrypt some data
      * and pass it the key context field <KEYctx>. That code is unchanged.

Notice that you’ve added a few lines of code to retrieve the key from the key server, and then used the retrieved key to create the key context. For most IBM i customers this will be a very quick change involving just a few lines of code. If you’ve taken a common module approach to isolate the encryption code, this might mean changing just one or two applications on your system. If you are using the IBM i Master Key and key database facility, you will have one more step to re-encrypt the data using keys from the Alliance Key Manager server.

Pretty simple process. Not bad for a day’s work.

Of course, there are proper ways to manage and protect an encryption key that has been retrieved from a key server, but we won’t go into that here. I want to save that topic for another day as it applies to many different application environments.

I hope you’ve gotten the idea that good key management doesn’t have to be a difficult, scary process. We are helping customers get this done today, and you can get there, too.

Click here to learn more about Alliance Key Manager and request an evaluation today.

Patrick

Topics: IBM i, PCI DSS, Encryption Key Management

Non-Standard Encryption – Now That Bites

Posted by Patrick Townsend on Feb 11, 2011 1:33:00 PM

In our encryption practice we often help customers integrate the exchange of encrypted data between different applications within the organization, and between their own applications and a vendor’s or customer’s application. It is truly amazing to me how often we encounter non-standard encryption that makes this integration very difficult. The problem is not the lack of standards for encryption. Most compliance regulations provide clear guidance and references to encryption standards. Here is what the PCI Data Security Standard (PCI DSS) Navigation Guide says about encryption (my emphasis):

The intent of strong cryptography (see definition and key lengths in the PCI DSS and PA-DSS Glossary of Terms, Abbreviations, and Acronyms) is that the encryption be based on an industry-tested and accepted algorithm (not a proprietary or "home-grown" algorithm).

Strong Encryption:
Cryptography based on industry-tested and accepted algorithms, along with strong key lengths and proper key-management practices. Cryptography is a method to protect data and includes both encryption (which is reversible) and hashing (which is not reversible, or “one way”). Examples of industry-tested and accepted standards and algorithms for encryption include AES (128 bits and higher), TDES (minimum double-length keys), RSA (1024 bits and higher), ECC (160 bits and higher), and ElGamal (1024 bits and higher). See NIST Special Publication 800-57 for more information.

The problem seems to be a general lack of knowledge about encryption. And this applies to some vendors of encryption solutions. Here are a couple of examples:

One of our customers was having trouble decrypting a field with our software that was encrypted on a Windows server with “256-bit AES using CBC mode”. That seemed to be a pretty straightforward task, yet we couldn’t get the data decrypted properly. The output just looked like garbage. We spent a fair amount of time with the customer verifying that they had the right decryption key, that the initialization vectors for the decryption were specified correctly, and that our software was being used correctly. But nothing was working. We then asked the third-party software vendor to share their AES source code with us. In this case the vendor was very accommodating and let us review their source code implementation of AES encryption.

Voila! The source code showed that the implementation was using a 256-bit block size for the encryption algorithm. The AES standard (FIPS-197) requires the use of 128-bit block sizes. The use of the larger block size meant that this was not even AES encryption according to the industry standard. The vendor fixed their implementation and the project was successful. But our customer spent a lot of time and money resolving the problem.
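
This point is easy to verify yourself: FIPS-197 fixes the AES block size at 128 bits for every key length, and only the key grows to 192 or 256 bits. A quick check in Python with the open source cryptography package:

    import os
    from cryptography.hazmat.primitives.ciphers import algorithms

    for key_bytes in (16, 24, 32):  # AES-128, AES-192, AES-256
        algo = algorithms.AES(os.urandom(key_bytes))
        assert algo.block_size == 128  # the block size never changes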

Another example of getting into trouble occurred with a customer who deployed an AES encryption solution that used the CUSP mode of encryption. This rang alarm bells right away. We knew that CUSP was not one of the NIST-approved modes of encryption, and we had never encountered it before. We quickly learned that CUSP stood for “Cryptographic Unit Service Provider” and was implemented by IBM in a couple of their server products. This customer had a couple of problems. First, CUSP was not a standard mode of encryption, and data encrypted with CUSP was not going to be decrypted by any of the standard open source or commercial products in the market. This customer was locked into an incompatible encryption strategy.

The second problem is that the CUSP mode of encryption is a proprietary protocol. The PCI DSS clearly states that strong encryption must be based on industry standards and not proprietary protocols (see above). As I interpret the PCI DSS requirements for encryption, a customer using CUSP mode would be out of compliance. That’s going to hurt the next time a QSA takes a hard look at your encryption implementation. We recently published a white paper on the weaknesses of the CUSP mode of encryption. Click here to download it.

One way to ensure that your AES encryption software is compliant is to look for a NIST certification. A NIST AES Validation certificate, or a NIST FIPS-140 certificate, is pretty good assurance of compliance. The FIPS-140 certification process requires AES Validation, so that certification is incorporated by reference. That’s why either certification will give you the assurance that AES encryption is being done according to the standard. Lacking certification, you are relying on the promises of a vendor or developer who may be ignorant, or have a motivation to be less than forthcoming. Not a good situation to find yourself in.

Both of these customers spent a fair amount of money fixing the problem.  An entirely avoidable expense.

Patrick

Topics: Encryption, NIST, CUSP, PCI DSS, AES, PCI, FIPS-140

Encryption Key Management: Top IT Initiative

Posted by John Earl on Feb 11, 2011 1:13:00 PM

I just returned from a trip to Europe, and encryption key management was a very hot topic. This is a topic I very much like to speak about, given our recent release of Alliance Key Manager. It still surprises me how many of the technology companies I spoke with understood the need for a proper key manager, either embedded within their product or appliance or integrated with it from the outside. There are, I think, a couple of reasons for this phenomenon.

First, many organizations are taking the step to encrypt sensitive data that used to be stored in the clear.  Protecting data is an important IT initiative these days, and one of the absolute best ways to protect data is to encrypt that data.  But as IT teams take on their encryption initiatives, somewhere in the middle of their first encryption project an important realization dawns upon them: After you encrypt the data, the data is only safe if you protect the encryption key.  At this point some organizations will put a temporary fix in place and "hide" the keys as best they can on the same server as the data, but they know this is wholly unsuitable and that a more secure and more permanent solution must be found.

The second reason I think key management has become such a hot topic on this trip is the increased number of compliance regulations around encryption key management. In October of 2010 the PCI DSS 2.0 standard was released, and it calls for organizations that store credit card information to use a certified key management solution that is separated from the data, includes robust auditing capability, and supports separation of duties and dual control (more on those topics perhaps in another blog post).

From my perspective, then, we appear to have just the right solution at just the right time. Having recently received our FIPS-140-2 certification for Alliance Key Manager in the U.S. Mail, we’re in a celebratory mood here at Townsend Security, and it is good to hear all our friends in Europe endorse the time and effort our team has put into this fabulous offering.

John Earl

Topics: Alliance Key Manager, Separation of Duties, PCI DSS, Encryption Key Management, FIPS-140, Dual Control

Encryption, Key Management and Tokenization - The Trifecta

Posted by Patrick Townsend on Feb 11, 2011 11:47:00 AM

Compliance regulations are moving inexorably towards requiring the protection of sensitive data. The private information of customers, employees, patients, vendors, and all of the people we come into contact with as enterprises must be protected from loss and misuse. At the same time that regulations are getting more teeth, there is more consensus about the technologies to protect data in our applications and databases. Encryption and tokenization are now the accepted methods of protecting data, and encryption key management is central to both technologies.

How fast are regulations changing? Really fast. The Payment Card Industry Security Standards Council will update the PCI Data Security Standard (PCI DSS) this year and will be on a three-year cycle of updates. Merchants accepting credit cards will have about 18 months to implement the changes. State privacy laws are undergoing frequent changes, most of which make the rules more stringent; Minnesota, Nevada, and Washington State have made recent changes. The HITECH Act of 2009 and related guidance further tighten the rules around protecting patient data, and further guidance is expected this year. Last, but not least, the federal government is moving new legislation through Congress to enact a national privacy law.

These changes are coming fast, and they have one thing in common: data protection requirements are getting stronger, not weaker. Companies and organizations should be paying attention to their data protection strategies now in order to avoid expensive rip-and-tear operations in the future.

One other tendency of the evolving regulations is a clear reference to standards for data protection. All of the mentioned regulations now make reference to widely accepted standards, usually those of the National Institute of Standards and Technology (NIST), which publishes standards and testing protocols for encryption and key management. Over the last two years PCI (and related guidance from Visa), the HITECH Act, state privacy laws, and other regulations have specifically referenced NIST for data encryption and key management standards.

Companies and organizations acquiring data protection technologies should look carefully at how solutions match up to the standards. And a word of warning here: there is still a lot of snake oil in the data protection industry. Be sure that your data protection vendor can prove that their solutions actually meet the NIST standards. This is not hard to independently verify: NIST publishes online lists of vendors who certify their solutions to the standard.

Encryption is a well-defined technology that protects data through the use of an encryption algorithm and secret key. When combined with proper key management, encryption provides a well-accepted method of protecting sensitive data. There is a long history of work by professional cryptographers and NIST on defining how good encryption and key management should work, and you can easily determine which vendors meet the standard through the certification process.

Tokenization is a newer technology that lacks the history and standards of encryption, but it incorporates encryption technologies. Tokenization works by substituting a surrogate value (or “token”) for the original data. By itself the token does not tell you anything about the original value, which might be a credit card number, patient ID, and so forth. But tokenization requires that you use good encryption practices to protect the sensitive data in the token database. This means you have to use a tokenization solution that meets the same stringent standards for encryption and key management. When acquiring a tokenization solution, you will want to use the same diligence about encryption and key management that you would use for a pure encryption solution; that is, the solution should be built to standards and the vendor should be able to prove it through the NIST certification process.
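
A toy Python sketch of the idea: the token is a random surrogate that reveals nothing about the original value, and the vault that maps tokens back to real values is exactly where strong encryption and key management still apply.

    import secrets

    token_vault = {}  # in a real system: an encrypted, access-controlled database

    def tokenize(pan: str) -> str:
        token = "tok_" + secrets.token_hex(8)  # random, not derived from the card number
        token_vault[token] = pan               # this stored value must be encrypted
        return token

    def detokenize(token: str) -> str:
        return token_vault[token]              # access here must be logged and restricted

    token = tokenize("4111111111111111")
    assert detokenize(token) == "4111111111111111"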

Remember, a tokenization solution will be IN SCOPE for a PCI audit!

Tokenization standards are still evolving. Bob Russo of the PCI Security Standards Council indicated that the council will be taking up work on this in 2010. Visa just released a best practices guide for tokenization (you can get it here), and you can probably expect the eventual standards to incorporate much of this guidance. Additionally, the X9 organization is also working on standards for tokenization.

In regard to tokenization standards, stay tuned! Much more is coming our way.

Encryption, tokenization, and key management – this is the trifecta for protecting data at rest. I’ll have more comments in the future about tokenization as we analyze the best practice guidance from Visa and help you connect the dots with our encryption, tokenization, and key management solutions.

Patrick

Topics: Encryption, NIST, Key Management, PCI DSS, tokenization

PCI DSS 2.0 and Encryption Key Management

Posted by Patrick Townsend on Feb 11, 2011 11:46:00 AM

2014 UPDATE:
No Significant Changes around Key Management in PCI DSS v3.0


The new PCI Data Security Standard (PCI DSS v2.0) is here, and I’ve gotten a lot of questions about the changes related to encryption key management. Because we work with a lot of companies going through PCI compliance audits and reviews, the new standard just confirms the trends we’ve seen over the last few months in how QSA auditors and security professionals view encryption key management, and what they see as the minimum requirements for managing keys. The clear trend is to require that encryption keys be stored separately from the data they protect, and to make sure that the people who manage encryption keys are not the people who manage the protected data. Let’s look at why this is happening.

While most of the largest merchants in the Level 1 category are already using professional key management solutions to protect encryption keys, the trend over the last 12 months is to require smaller merchants in the Level 2 and Level 3 categories to adopt better key management practices as well. So, what are the parts of PCI DSS that are driving this change? It all has to do with industry best practices for encryption key management and the concepts of Dual Control, Separation of Duties, and Split Knowledge. These best practices and concepts work together to form the basis for determining whether your approach to key management will pass muster.

First, what is the source of industry best practices for key management? Here in the US, the National Institute of Standards and Technology (NIST) is the most common source for guidance on best practices. NIST Special Publication SP 800-57 provides specific guidance both on best practices for procedurally managing encryption keys and on what to look for in key management systems. In these documents you will find the genesis of most standards regarding encryption key management, including the concepts in PCI DSS 2.0 Section 3.

Next, it is important to understand Dual Control, Separation of Duties, and Split Knowledge. These are all clearly defined in the PCI DSS standard and in the accompanying PCI DSS glossary. I’ve extracted the exact definitions below, but I’ll recap them here from the point of view of key management.

Dual Control means that no one person should be able to manage your encryption keys. Creating, distributing, and defining access controls should require at least two individuals working together to accomplish the task.

Separation of Duties means that different people should control different aspects of your key management strategy. This is the old adage “don’t put all your eggs in one basket.” The person who creates and manages the keys should not have access to the data they protect, and the person with access to protected data should not be able to manage encryption keys.

Split Knowledge applies to the manual generation of encryption keys, or at any point where encryption keys are available in the clear. More than one person should be required to constitute or re-constitute a key in this situation.
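
Split Knowledge is easy to illustrate. In the classic two-component scheme, the key is split into XOR shares; either share alone is statistically indistinguishable from random data, so neither custodian learns anything about the key. A minimal Python sketch:

    import os

    def split_key(key: bytes):
        """Split a key into two XOR components for two custodians."""
        share_a = os.urandom(len(key))                        # custodian A's component
        share_b = bytes(x ^ y for x, y in zip(key, share_a))  # custodian B's component
        return share_a, share_b

    def join_key(share_a: bytes, share_b: bytes) -> bytes:
        """Reconstitute the key; both components are required."""
        return bytes(x ^ y for x, y in zip(share_a, share_b))

    key = os.urandom(32)
    a, b = split_key(key)
    assert join_key(a, b) == key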

What are the practical implications of these best practices and core concepts? One of them follows from a common fact of system administration. On all major operating systems such as Linux, Windows, and IBM System i (AS/400), there is a user with the authority to manage all processes and files on the system. This is the Administrator on Windows, the root user on Linux and UNIX, and the security officer on the IBM System i platform. In fact, there are usually multiple people who have this level of authority. In one study by PowerTech, the average IBM System i customer had 26 users with this level of authority!

That’s why storing encryption keys on the same system where the protected data resides violates all of the core principles of data protection, and that’s why we are seeing auditors and payment networks reject this approach. If you haven’t faced this issue yet, your day is probably coming. Now is the time to start planning how to deal with the problem.

Over two years ago we saw this trend developing and took action to help merchants be prepared for proper key management. We created the Alliance Key Manager solution and released it to our partner channel in 2009. This year we released it for direct sale, and last week we received our FIPS-140-2 certification from NIST. Over 1,000 customers are now using AKM to protect their encryption keys with a solution that provably meets industry standards.  Our encryption products have been updated to use this new key management solution, and we are moving customers forward to compliance. It’s been a long, hard slog to NIST FIPS-140 certification, but I think our customers will benefit from the effort.

I hope this has been helpful in clarifying key management best practices. For more information on PCI and key management, download our podcast titled "Key Management Best Practices: What New PCI Regulations Say." Please let us know if you have any questions.


---

From the PCI DSS version 2.0 Glossary:

Dual control
“Process of using two or more separate entities (usually persons) operating in concert to protect sensitive functions or information. Both entities are equally responsible for the physical protection of materials involved in vulnerable transactions. No single person is permitted to access or use the materials (for example, the cryptographic key). For manual key generation, conveyance, loading, storage, and retrieval, dual control requires dividing knowledge of the key among the entities. (See also Split Knowledge).”


Separation of Duties
“Practice of dividing steps in a function among different individuals, so as to keep a single individual from being able to subvert the process.”

Split knowledge
“Condition in which two or more entities separately have key components that individually convey no knowledge of the resultant cryptographic key.”

Source documents are available online at www.pcisecuritystandards.org

Topics: Compliance, Encryption, Key Management, PCI DSS, PCI, FIPS-140