Townsend Security Data Privacy Blog

PCI SSC Issues New Tokenization Guidance

Posted by Patrick Townsend on Aug 12, 2011 12:39:00 PM

Today the Payment Card Industry Security Standards Council issued long-awaited guidance on tokenization as a method of protecting credit card information (“PAN”, or Primary Account Number in PCI language). This guidance has taken a number of months to produce, and it will be very helpful to merchants who want to deploy tokenization solutions to reduce the scope of their PCI compliance efforts, to those who have already deployed solutions and need to assess whether they meet minimum requirements, and to QSA auditors who have been looking for guidance in this area.

The link to the guidance is here.

Here are some first thoughts on reading the guidance.

As I’ve been saying here before, a tokenization solution is always in scope for PCI DSS compliance. This means the merchant is responsible for ensuring that the tokenization solution itself meets all of the PCI DSS requirements. Whether you create your own tokenization application or acquire one from a vendor, it is in scope for PCI DSS compliance and you are responsible for it. The guidance makes this point very clearly. In fact, the guidance observes that tokenization solutions are likely to be targets of attack because they become a centralized repository of credit card information. Merchants should be very careful that their tokenization solutions meet all PCI DSS requirements and are sufficiently hardened. In my opinion, very few merchants would be up to the task of creating such a solution in-house.


Following on the previous point, a tokenization solution must have encryption and key management capabilities that meet PCI DSS requirements. This means taking special care in the deployment and implementation of key management. Be very sure that your key management solution properly implements Dual Control, Separation of Duties, and Split Knowledge, and that it separates the keys from the data they protect. When acquiring a vendor solution for tokenization, look for FIPS-140-2 certification of the key management system – that should be a minimum requirement, since it indicates a proper implementation of encryption and its controls.

Download a recent webcast I recorded that discusses the importance of tokenization and PCI compliance.


Today’s announcement by the PCI Security Standards Council on tokenization is certain to raise awareness about the importance of selecting the right tokenization solution for your organization and ensuring it meets the PCI DSS requirements. I will post another article on Tuesday with more thoughts on this new guidance.

I hope you find this helpful.


Patrick

 

Topics: tokenization, PCI, PCI SSC

Non-Standard Encryption – Now That Bites

Posted by Patrick Townsend on Feb 11, 2011 1:33:00 PM

In our encryption practice we often help customers integrate the exchange of encrypted data between different applications within the organization, and between their own applications and a vendor’s or customer’s application. It is truly amazing to me how often we encounter non-standard encryption that makes this integration very difficult. The problem is not the lack of standards for encryption. Most compliance regulations provide clear guidance and references to encryption standards. Here is what the PCI Data Security Standard (PCI DSS) Navigation Guide says about encryption (my emphasis):

The intent of strong cryptography (see definition and key lengths in the PCI DSS and PA-DSS Glossary of Terms, Abbreviations, and Acronyms) is that the encryption be based on an industry-tested and accepted algorithm (not a proprietary or "home-grown" algorithm).

Strong Encryption:
Cryptography based on industry-tested and accepted algorithms, along with strong key lengths and proper key-management practices. Cryptography is a method to protect data and includes both encryption (which is reversible) and hashing (which is not reversible, or “one way”). Examples of industry-tested and accepted standards and algorithms for encryption include AES (128 bits and higher), TDES (minimum double-length keys), RSA (1024 bits and higher), ECC (160 bits and higher), and ElGamal (1024 bits and higher). See NIST Special Publication 800-57 for more information.

The problem seems to be a general lack of knowledge about encryption. And this applies to some vendors of encryption solutions. Here are a couple of examples:

One of our customers was having trouble using our software to decrypt a field that had been encrypted on a Windows server with “256-bit AES using CBC mode”. That seemed to be a pretty straightforward task, yet we couldn’t get the data decrypted properly. The output just looked like garbage. We spent a fair amount of time with the customer verifying that they had the right decryption key, that the initialization vectors for the decryption were specified correctly, and that our software was being used correctly. But nothing was working. We then asked the third-party software vendor to share their AES source code with us. In this case the vendor was very accommodating and let us review their source code implementation of AES encryption.

Voila! The source code showed that the implementation was using a 256-bit block size for the encryption algorithm. The AES standard (FIPS-197) requires the use of 128-bit block sizes. The use of the larger block size meant that this was not even AES encryption according to the industry standard. The vendor fixed their implementation and the project was successful. But our customer spent a lot of time and money resolving the problem.
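To make “standard AES” concrete, here is a minimal sketch (my own illustration, not the vendor’s code) using the standard Java cryptography API. The key point is that FIPS-197 fixes the AES block size at 128 bits no matter how long the key is; only the key length (128, 192, or 256 bits) varies.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;

public class AesCbcExample {
    public static void main(String[] args) throws Exception {
        // A 256-bit AES key; the block size is still 128 bits per FIPS-197.
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(256);
        SecretKey key = keyGen.generateKey();

        // CBC mode needs one block (16 bytes) of random initialization vector.
        byte[] iv = new byte[16];
        new SecureRandom().nextBytes(iv);

        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        byte[] ciphertext = cipher.doFinal("4111111111111111".getBytes(StandardCharsets.UTF_8));

        // Any standards-based AES implementation can decrypt this ciphertext
        // given the same key, IV, mode, and padding.
        cipher.init(Cipher.DECRYPT_MODE, key, new IvParameterSpec(iv));
        System.out.println(new String(cipher.doFinal(ciphertext), StandardCharsets.UTF_8));
    }
}
```

That interoperability is exactly what the non-standard 256-bit block size broke: no conforming library could round-trip the vendor’s ciphertext.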

Another example of getting into trouble occurred with a customer who deployed an AES encryption solution that used the CUSP mode of encryption. This rang alarm bells right away. We knew that CUSP was not one of the NIST-approved modes of encryption, and we had never encountered it before. We quickly learned that CUSP stood for “Cryptographic Unit Service Provider” and was implemented by IBM in a couple of their server products. This customer had a couple of problems. First, CUSP was not a standard mode of encryption, and data encrypted with CUSP was not going to be decrypted by any of the standard open source or commercial products on the market. So this customer was locked into an incompatible encryption strategy.

The second problem is that the CUSP mode of encryption is a proprietary protocol. The PCI DSS clearly states that strong encryption must be based on industry standards and not proprietary protocols (see above). As I interpret the PCI DSS requirements for encryption, a customer using CUSP mode would be out of compliance. That’s going to hurt the next time a QSA takes a hard look at your encryption implementation. We recently published a white paper on the weaknesses of the CUSP mode of encryption. Click here to download it.

One way to ensure that your AES encryption software is compliant is to look for a NIST certification. A NIST AES Validation certificate, or a NIST FIPS-140 certificate, is pretty good assurance of compliance. The FIPS-140 certification process requires AES Validation, so that certification is incorporated by reference; either certification gives you assurance that AES encryption is being done according to the standard. Lacking certification, you are relying on the promises of a vendor or developer who may be ignorant, or who may have a motivation to be less than forthcoming. That is not a good situation to find yourself in.

Both of these customers spent a fair amount of money fixing a problem that was entirely avoidable.

Patrick

Topics: Encryption, NIST, CUSP, PCI DSS, AES, PCI, FIPS-140

PCI DSS 2.0 and Encryption Key Management

Posted by Patrick Townsend on Feb 11, 2011 11:46:00 AM

2014 UPDATE:
No Significant Changes around Key Management in PCI DSS v3.0


The new PCI Data Security Standard (PCI DSS v2.0) is here, and I’ve gotten a lot of questions about the changes related to encryption key management. Because we work with a lot of companies going through PCI compliance audits and reviews, the new standard just confirms the trends we’ve seen over the last few months in how QSA auditors and security professionals view encryption key management, and what they see as the minimum requirements for managing keys. The clear trend is to require that encryption keys be stored separately from the data they protect, and to make sure that the people who manage encryption keys are not the people who manage the protected data. Let’s look at why this is happening.

While most of the largest merchants in the Level 1 category are already using professional key management solutions to protect encryption keys, the trend over the last 12 months is to require smaller merchants in the Level 2 and Level 3 categories to use better key management practices, too. So, what are the parts of PCI DSS that are driving this change? It all has to do with industry best practices for encryption key management, and the concepts of Dual Control, Separation of Duties, and Split Knowledge. These best practices and concepts work together to form the basis for determining whether your approach to key management will pass muster.

First, what is the source of industry best practices for key management? Here in the US, the National Institute of Standards and Technology (NIST) is the most common source for guidance on best practices. The NIST special publication SP 800-57 provides specific pointers on best practices, both for procedurally managing encryption keys and for what to look for in key management systems. In these documents you will find the genesis of most standards regarding encryption key management, including the concepts in PCI DSS 2.0 Section 3.

Next, it is important to understand Dual Control, Separation of Duties, and Split Knowledge. These are all clearly defined in the PCI DSS standard and in the accompanying PCI DSS glossary. I’ve extracted the exact definitions below, but I’ll recap them here from the point of view of key management.

Dual Control means that no one person should be able to manage your encryption keys. Creating, distributing, and defining access controls should require at least two individuals working together to accomplish the task.

Separation of Duties means that different people should control different aspects of your key management strategy. This is the old adage “don’t put all your eggs in one basket”. The person who creates and manages the keys should not have access to the data they protect, and the person with access to the protected data should not be able to manage the encryption keys.

Split Knowledge applies to the manual generation of encryption keys, or at any point where encryption keys are available in the clear. More than one person should be required to constitute or re-constitute a key in this situation.
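To illustrate Split Knowledge, here is a minimal sketch of the classic XOR key-component approach (my own illustration; a real key manager performs this inside certified hardware or software and never exposes the clear key). Each custodian’s component is random on its own and conveys no knowledge of the final key; only combining both components re-constitutes it.

```java
import java.security.SecureRandom;

public class SplitKnowledgeSketch {
    // Split a key into two components. Each component alone is just random data.
    public static byte[][] split(byte[] key) {
        byte[] componentA = new byte[key.length];
        new SecureRandom().nextBytes(componentA);            // held by custodian A
        byte[] componentB = new byte[key.length];
        for (int i = 0; i < key.length; i++) {
            componentB[i] = (byte) (key[i] ^ componentA[i]); // held by custodian B
        }
        return new byte[][] { componentA, componentB };
    }

    // Both custodians must be present to re-constitute the key (dual control).
    public static byte[] combine(byte[] componentA, byte[] componentB) {
        byte[] key = new byte[componentA.length];
        for (int i = 0; i < key.length; i++) {
            key[i] = (byte) (componentA[i] ^ componentB[i]);
        }
        return key;
    }
}
```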

What are the practical implications of these best practices and core concepts? One of the practical implications follows from a common fact of system administration. On all major operating systems such as Linux, Windows, and IBM System i (AS/400), there is one individual who has the authority to manage all processes and files on the system. This is the Administrator on Windows, the root user on Linux and UNIX, and the security officer on the IBM System i platform. In fact, there are usually multiple people who have this level of authority. In one study by PowerTech, the average IBM System i customer had 26 users with this level of authority!

That’s why storing encryption keys on the same system where the protected data resides violates all of the core principles of data protection, and that’s why we are seeing auditors and payment networks reject this approach. If you haven’t faced this issue yet, your day is probably coming. Now is the time to start planning on how to deal with the problem.
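As a purely architectural sketch of that separation (the interface below is hypothetical, not the API of Alliance Key Manager or any other product), the application retrieves its key from a separate, hardened key server at runtime, so the clear key is never stored on the same system as the protected data:

```java
import javax.crypto.SecretKey;
import javax.crypto.spec.SecretKeySpec;

// Hypothetical client interface for illustration only. The point is architectural:
// key material lives on a separate key server and is fetched over a mutually
// authenticated TLS session, so an administrator with root or all-object authority
// on the application server never finds it in a local file or database column.
interface KeyServiceClient {
    byte[] retrieveKeyBytes(String keyName) throws Exception;
}

public class PanEncryptionService {
    private final KeyServiceClient keyService;

    public PanEncryptionService(KeyServiceClient keyService) {
        this.keyService = keyService;
    }

    // The clear key exists only in memory for the life of the operation.
    SecretKey currentKey() throws Exception {
        return new SecretKeySpec(keyService.retrieveKeyBytes("PAN-ENCRYPTION-KEY"), "AES");
    }
}
```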

Over two years ago we saw this trend developing and took action to help merchants be prepared for proper key management. We created the Alliance Key Manager solution and released it to our partner channel in 2009. This year we released it for direct sale, and last week we received our FIPS-140-2 certification from NIST. Over 1,000 customers are now using AKM to protect their encryption keys with a solution that provably meets industry standards.  Our encryption products have been updated to use this new key management solution, and we are moving customers forward to compliance. It’s been a long, hard slog to NIST FIPS-140 certification, but I think our customers will benefit from the effort.

I hope this has been helpful in clarifying key management best practices. For more information on PCI and key management, download our podcast titled "Key Management Best Practices: What New PCI Regulations Say." Please let us know if you have any questions.


---

From the PCI DSS version 2.0 Glossary:

Dual control
“Process of using two or more separate entities (usually persons) operating in concert to protect sensitive functions or information. Both entities are equally responsible for the physical protection of materials involved in vulnerable transactions. No single person is permitted to access or use the materials (for example, the cryptographic key). For manual key generation, conveyance, loading, storage, and retrieval, dual control requires dividing knowledge of the key among the entities. (See also Split Knowledge).”


Separation of Duties
“Practice of dividing steps in a function among different individuals, so as to keep a single individual from being able to subvert the process.”

Split knowledge
“Condition in which two or more entities separately have key components that individually convey no knowledge of the resultant cryptographic key.”

Source documents are available online at www.pcisecuritystandards.org

Topics: Compliance, Encryption, Key Management, PCI DSS, PCI, FIPS-140

XML, Web Services, and Encryption

Posted by Patrick Townsend on Dec 15, 2010 11:29:00 AM

One clear direction I’ve observed over the last few months is the focus of QSA auditors and other security professionals on the protection of sensitive data AFTER it traverses the Internet and then lands in a database on a hard disk drive. We have really good ways of protecting data in transit using 128-bit SSL encryption. For example, the web protocols HTTPS and FTPS provide the ability to encrypt the data in transit, and Secure Shell (SSH) also provides strong encryption. But after the data reaches the end point of its journey it lands on a hard drive somewhere, and it is often exposed to loss at that point. I believe that’s why security auditors are now putting a lot of emphasis on making sure that data is encrypted when it hits a hard drive.

Many companies have implemented web services in combination with the XML data standard to take advantage of low-cost, real-time integration with their customers and vendors. When you combine the ubiquity of the HTTPS protocol with the W3C XML standard, you get a powerful incentive to use this platform for business integration.
 
But care should be given to what happens to data when it leaves the realm of encrypted transit and lands on server hard drives.

Of course, the right thing to do is encrypt sensitive data before it lands on the hard drive. This means that the tools you are using have to support encryption as a natural part of the process of converting XML data. Standard XML processing tools such as Xerces and XPath do not have built-in encryption, and the same is true for the XML toolkits and APIs provided by IBM, Microsoft, and others. This leaves it to developers to intercept data after it is transformed from XML and before it lands in a database table or on a hard drive. That’s a real challenge.
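As a rough sketch of that intercept (the element name, table, and column names below are made up for illustration), the pattern is to parse the inbound XML, encrypt the sensitive field in memory, and only then write it to the database:

```java
import javax.crypto.Cipher;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;
import javax.xml.parsers.DocumentBuilderFactory;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.Base64;
import org.w3c.dom.Document;

public class XmlIntercept {
    // Pull the card number out of an inbound XML order, encrypt it, and only
    // then insert it -- the clear value never reaches the disk.
    static void storeOrder(String xml, SecretKey key, Connection db) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        String pan = doc.getElementsByTagName("CardNumber").item(0).getTextContent();

        byte[] iv = new byte[16];
        new SecureRandom().nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        String protectedPan = Base64.getEncoder()
                .encodeToString(cipher.doFinal(pan.getBytes(StandardCharsets.UTF_8)));

        try (PreparedStatement stmt = db.prepareStatement(
                "INSERT INTO orders (pan_enc, pan_iv) VALUES (?, ?)")) {
            stmt.setString(1, protectedPan);
            stmt.setString(2, Base64.getEncoder().encodeToString(iv));
            stmt.executeUpdate();
        }
    }
}
```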

In our Alliance XML/400 web services product on the IBM platform we built encryption right into the data transformation process about four years ago. Alliance XML/400 customers can protect sensitive data by just enabling the encryption option on a translation map. The solution does the rest. The data is encrypted before insertion into the database and there is no exposure as the data lands in the database on the hard drive. Our customers are taking advantage of this feature to meet PCI and other compliance regulations.

For non-IBM System i environments we now provide an easy way to retrieve encryption keys and perform encryption in a variety of development languages such as Microsoft .NET, Java, and C/C++.

Encryption can help protect against another common threat, too. At the annual PCI SSC meeting in Orlando this year, forensics expert Chris Novak of Verizon talked about how more than 75 percent of data loss events begin with a well-known weakness that hasn’t been patched, and half of these are based on SQL injection attacks. With SQL injection, the attack on your servers starts with bad data inserted into a database in the clear, leaving open a later exploit. There are ways to prevent SQL injection through programming techniques, and encrypting the stored data also limits what a successful attack can expose.
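For example, a parameterized query is the standard programming defense against SQL injection; this minimal sketch (with a hypothetical orders table) contrasts it with the string concatenation that creates the exposure:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class OrderLookup {
    // Vulnerable pattern: attacker-controlled input concatenated into the SQL text, e.g.
    //   "SELECT order_id FROM orders WHERE customer_id = '" + input + "'"
    // Safe pattern: a parameterized query binds the input as data, never as SQL.
    static ResultSet findOrders(Connection db, String customerId) throws Exception {
        PreparedStatement stmt = db.prepareStatement(
                "SELECT order_id, order_total FROM orders WHERE customer_id = ?");
        stmt.setString(1, customerId); // the driver binds the value; the SQL text is fixed
        return stmt.executeQuery();
    }
}
```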

Will encrypting your data provide all of the security protection you need? Certainly not. I like to think of it this way:  Wearing a parachute on a skydiving expedition is no guarantee that you won’t be hurt when you land.  But that doesn’t mean you wouldn’t wear one, right? I think of encryption in the same way.

To view a replay of a recent webinar we presented on XML & Web Services, click here.

Patrick

Topics: Encryption, HTTPS, HITECH, HIPAA, AES, PCI, SFTP, web services, XML, FTPS, SSL/TLS, SSL