Townsend Security Data Privacy Blog

Encryption, Key Management and Tokenization - The Trifecta

Posted by Patrick Townsend on Feb 11, 2011 11:47:00 AM

Compliance regulations are moving inexorably toward requiring the protection of sensitive data. The private information of customers, employees, patients, vendors, and all of the people we come into contact with as enterprises must be protected from loss and misuse. At the same time that regulations are getting more teeth, there is growing consensus about the technologies to protect data in our applications and databases. Encryption and tokenization are now the accepted methods of protecting data, and encryption key management is central to both technologies.

How fast are regulations changing? Really fast. The Payment Card Industry Security Standards Council will update the PCI Data Security Standard (PCI DSS) this year, and will then be on a three-year cycle of updates. Merchants accepting credit cards will have about 18 months to implement the changes. State privacy laws are undergoing frequent changes, most of which make the rules more stringent; Minnesota, Nevada, and Washington State have all made recent changes. The HITECH Act of 2009 and related guidance further tightens the rules around protecting patient data, and further guidance is expected this year. Last, but not least, the federal government is moving new legislation through Congress to enact a national privacy law.

These changes are coming fast, and they have one thing in common: data protection requirements are getting stronger, not weaker. Companies and organizations should be paying attention to their data protection strategies now in order to avoid expensive rip-and-replace operations in the future.

One other tendency of the evolving regulations is clear: they now reference standards for data protection. All of the regulations mentioned above refer to widely accepted standards, usually those of the National Institute of Standards and Technology (NIST), which publishes standards and testing protocols for encryption and key management. Over the last two years, PCI (and related guidance from Visa), the HITECH Act, state privacy laws, and other regulations have specifically referenced NIST standards for data encryption and key management.

Companies and organizations acquiring data protection technologies should look carefully at how solutions match up to the standards. And a word of warning here: there is still a lot of snake oil in the data protection industry. Be sure that your data protection vendor can prove that their solutions actually meet the NIST standards. This is not hard to independently verify – NIST publishes online lists of vendors who certify their solutions to the standard.

Encryption is a well-defined technology that protects data through the use of an encryption algorithm and a secret key. When combined with proper key management, encryption provides a widely accepted method of protecting sensitive data. There is a long history of work by professional cryptographers and NIST on defining how good encryption and key management should work, and you can easily determine which vendors meet the standard through the certification process.
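
To make that definition concrete, here is a minimal C# sketch of the two ingredients: a NIST-standard algorithm (AES) and a secret key. This is illustrative only; in a real deployment the key would come from a key management system rather than being generated in application code.

    using System;
    using System.Security.Cryptography;
    using System.Text;

    class AesExample
    {
        static void Main()
        {
            // Example sensitive value only
            byte[] plaintext = Encoding.UTF8.GetBytes("4111-1111-1111-1111");

            using (Aes aes = Aes.Create())   // AES, the NIST-standard algorithm (FIPS 197)
            {
                aes.KeySize = 256;           // 256-bit secret key
                aes.GenerateKey();           // in production, retrieve this from a key manager
                aes.GenerateIV();

                using (ICryptoTransform enc = aes.CreateEncryptor())
                {
                    byte[] ciphertext = enc.TransformFinalBlock(plaintext, 0, plaintext.Length);
                    Console.WriteLine(Convert.ToBase64String(ciphertext));
                }
            }
        }
    }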

Tokenization is a newer technology that lacks the history and standards of encryption, but it incorporates encryption technology. Tokenization works by substituting a surrogate value (or “token”) for the original data. By itself, the token does not tell you anything about the original value, which might be a credit card number, patient ID, and so forth. But tokenization requires that you use good encryption practices to protect the sensitive data in the token database. This means you have to use a tokenization solution that meets the same stringent standards for encryption and key management. When acquiring a tokenization solution, you will want to use the same diligence about encryption and key management that you would use for a pure encryption solution – that is, the solution should be built to standards and the vendor should be able to prove it through the NIST certification process.
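
As a rough illustration of the substitution idea (a sketch, not any vendor's implementation), here is a simplified token vault in C#. The class and method names are hypothetical, and a real token database would persist the encrypted values under the full key management controls described above.

    using System;
    using System.Collections.Generic;
    using System.Security.Cryptography;

    // Simplified token vault: the token itself conveys nothing about the
    // original value; the original is stored encrypted in the vault.
    class TokenVault
    {
        private readonly Dictionary<string, byte[]> vault = new Dictionary<string, byte[]>();

        public string Tokenize(byte[] encryptedOriginal)
        {
            byte[] random = new byte[16];
            using (var rng = RandomNumberGenerator.Create())
                rng.GetBytes(random);
            string token = Convert.ToBase64String(random); // surrogate value
            vault[token] = encryptedOriginal;              // encrypted PAN, patient ID, etc.
            return token;
        }

        public byte[] Detokenize(string token)
        {
            return vault[token]; // caller decrypts with a key from the key manager
        }
    }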

Remember, a tokenization solution will be IN SCOPE for a PCI audit!

Tokenization standards are still evolving. Bob Russo of the PCI Security Standards Council indicated that the council would take up work on this in 2010. Visa just released a best practices guide for tokenization (you can get it here), and you can probably expect the eventual standards to incorporate much of this guidance. The X9 organization is also working on standards for tokenization.

With regard to tokenization standards, stay tuned! Much more is coming our way.

Encryption, tokenization, and key management – this is the trifecta for protecting data at rest. I’ll have more comments in the future about tokenization as we analyze the best practice guidance from Visa and help you connect the dots with our encryption, tokenization, and key management solutions.

Patrick

Topics: Encryption, NIST, Key Management, PCI DSS, tokenization

PCI DSS 2.0 and Encryption Key Management

Posted by Patrick Townsend on Feb 11, 2011 11:46:00 AM

2014 UPDATE:
No Significant Changes around Key Management in PCI DSS v3.0

The new PCI Data Security Standard (PCI DSS v2.0) is here, and I’ve gotten a lot of questions about the changes related to encryption key management. Because we work with a lot of companies going through PCI compliance audits and reviews, the new standard just confirms the trends we’ve seen over the last few months in how QSA auditors and security professionals view encryption key management, and what they see as the minimum requirements for managing keys. The clear trend is to require that encryption keys be stored separately from the data they protect, and to make sure that the people who manage encryption keys are not the people who manage the protected data. Let’s look at why this is happening.

While most of the largest merchants in the Level 1 category are already using professional key management solutions to protect encryption keys, the trend over the last 12 months is to require smaller merchants in the Level 2 and Level 3 categories to use better key management practices, too. So, what are the parts of PCI DSS that are driving this change? It all has to do with industry best practices for encryption key management, and the concepts of Dual Control, Separation of Duties, and Split Knowledge. These best practices and concepts work together to form the basis for determining if your approach to key management will pass muster.

First, what is the source of industry best practices for key management? Here in the US, the National Institute of Standards and Technology (NIST) is the most common source for guidance on best practices. NIST Special Publication SP 800-57 provides specific pointers both on best practices for procedurally managing encryption keys and on what to look for in key management systems. In these documents you will find the genesis of most standards regarding encryption key management, including the concepts in PCI DSS 2.0 Section 3.

Next, it is important to understand Dual Control, Separation of Duties, and Split Knowledge. These are all clearly defined in the PCI DSS standard and in the accompanying PCI DSS glossary. I’ve extracted the exact definitions below, but I’ll recap them here from the point of view of key management.

Dual Control means that no one person should be able to manage your encryption keys. Creating, distributing, and defining access controls should require at least two individuals working together to accomplish the task.

Separation of Duties means that different people should control different aspects of your key management strategy. This is the old adage “don’t put all your eggs in one basket”. The person who creates and manages the keys should not have access to the data they protect. And the person with access to protected data should not be able to manage encryption keys.

Split Knowledge applies to the manual generation of encryption keys, or to any point where encryption keys are available in the clear. More than one person should be required to constitute or reconstitute a key in this situation.
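
A classic way to satisfy Split Knowledge during manual key handling is XOR key components: each custodian holds a component that looks like random noise on its own, and only the combination yields the key. A minimal C# sketch (illustrative, not a production key ceremony):

    using System;
    using System.Security.Cryptography;

    // Split Knowledge sketch: two custodians each hold one key component;
    // neither component alone conveys any knowledge of the final key.
    class SplitKnowledgeExample
    {
        public static byte[][] Split(byte[] key)
        {
            byte[] componentA = new byte[key.Length];
            using (var rng = RandomNumberGenerator.Create())
                rng.GetBytes(componentA);                       // component A: pure random
            byte[] componentB = new byte[key.Length];
            for (int i = 0; i < key.Length; i++)
                componentB[i] = (byte)(key[i] ^ componentA[i]); // component B: key XOR A
            return new[] { componentA, componentB };
        }

        public static byte[] Reconstitute(byte[] componentA, byte[] componentB)
        {
            byte[] key = new byte[componentA.Length];
            for (int i = 0; i < componentA.Length; i++)
                key[i] = (byte)(componentA[i] ^ componentB[i]); // both components required
            return key;
        }
    }

Because component A is uniformly random, component B is also uniformly random in isolation, so neither custodian alone learns anything about the key.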

What are the practical implications of these best practices and core concepts? One of the practical implications follows from a common fact of system administration. On all major operating systems such as Linux, Windows, and IBM System i (AS/400), there is a class of user who has the authority to manage all processes and files on the system. This is the Administrator on Windows, the root user on Linux and UNIX, and the security officer on the IBM System i platform. In fact, there are usually multiple people who have this level of authority. In one study by PowerTech, the average IBM System i customer had 26 users with this level of authority!

That’s why storing encryption keys on the same system where the protected data resides violates all of the core principles of data protection, and that’s why we are seeing auditors and payment networks reject this approach. If you haven’t faced this issue yet, your day is probably coming. Now is the time to start planning on how to deal with the problem.

Over two years ago we saw this trend developing and took action to help merchants be prepared for proper key management. We created the Alliance Key Manager (AKM) solution and released it to our partner channel in 2009. This year we released it for direct sale, and last week we received our FIPS-140-2 certification from NIST. Over 1,000 customers are now using AKM to protect their encryption keys with a solution that provably meets industry standards. Our encryption products have been updated to use this new key management solution, and we are moving customers forward to compliance. It’s been a long, hard slog to NIST FIPS-140 certification, but I think our customers will benefit from the effort.

I hope this has been helpful in clarifying key management best practices. For more information on PCI and key management, download our podcast titled "Key Management Best Practices: What New PCI Regulations Say." Please let us know if you have any questions.

---

From the PCI DSS version 2.0 Glossary:

Dual control
“Process of using two or more separate entities (usually persons) operating in concert to protect sensitive functions or information. Both entities are equally responsible for the physical protection of materials involved in vulnerable transactions. No single person is permitted to access or use the materials (for example, the cryptographic key). For manual key generation, conveyance, loading, storage, and retrieval, dual control requires dividing knowledge of the key among the entities. (See also Split Knowledge).”


Separation of Duties
“Practice of dividing steps in a function among different individuals, so as to keep a single individual from being able to subvert the process.”

Split knowledge
“Condition in which two or more entities separately have key components that individually convey no knowledge of the resultant cryptographic key.”

Source documents are available online at www.pcisecuritystandards.org

Topics: Compliance, Encryption, Key Management, PCI DSS, PCI, FIPS-140

Encryption and Key Management in a .NET World

Posted by Patrick Townsend on Feb 11, 2011 11:41:00 AM

As we’ve developed more solutions for the Microsoft Windows platform, I’ve come to appreciate the rich set of programming languages that Microsoft provides to developers, and the deep support they provide to both beginner and advanced programmers. There is a large group of experts inside of Microsoft and in the outside developer community that generously supports Windows developers. Whether you develop in .NET or C#, no one has done better than Microsoft at generating and supporting this developer community. At a recent SQL PASS conference I saw T-shirts with the slogan:

    “2.7 Million Geeks Can’t Be Wrong”
    
What’s surprising about that 2.7 million number is that it is not surprising. There may actually be that many Microsoft developers out there.

Encryption and key management have become big challenges for Microsoft developers as various compliance regulations mature and require separation of encryption keys from the data they protect (see my previous blogs on this issue). In the past a Microsoft developer might have used the Windows DPAPI to store encryption keys, or might have stored them in a SQL Server database. That approach can’t meet the requirements for Dual Control, Separation of Duties, and Split Knowledge required by good security practice and compliance regulations such as PCI DSS. These .NET and C# developers are now changing their encryption strategy to incorporate external key managers into their applications.
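
For context, the older local-storage pattern looked something like this sketch: DPAPI protects the key, but the protected blob lives on the same machine as the data it unlocks, so Separation of Duties is impossible. (A sketch only; assumes a reference to System.Security.dll on Windows.)

    using System.Security.Cryptography;

    // The old, local-only pattern: the key is protected by DPAPI on the
    // same machine that holds the encrypted data.
    class DpapiKeyStorage
    {
        public static byte[] ProtectKey(byte[] key)
        {
            return ProtectedData.Protect(key, null, DataProtectionScope.LocalMachine);
        }

        public static byte[] UnprotectKey(byte[] blob)
        {
            return ProtectedData.Unprotect(blob, null, DataProtectionScope.LocalMachine);
        }
    }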

That’s where a Microsoft developer often encounters some friction. Key management vendors are generally not responsive to the needs of a Microsoft developer and fail to provide interfaces that work naturally in this environment. Complex DLL implementations that require special .NET wrapper code, poor integration with the existing .NET encryption APIs, and the absence of quality sample code make life difficult for the Microsoft developer. And this means that application development slows down and gets more expensive. That’s bad news in a Windows developer world.

I think we’ve done a lot of things right for Microsoft developers with our Alliance Key Manager solution. We provide a .NET assembly key retrieval library that integrates naturally in all of the Microsoft development languages. We also provide for key retrieval directly into .NET applications so that developers can use the native .NET encryption libraries. By not forcing server-based encryption or the use of special encryption libraries, the developer can decide for themselves the best approach to encryption once they have an encryption key retrieved to their application. This approach also supports Microsoft’s Managed Code architecture.
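
The resulting pattern looks roughly like the sketch below. RetrieveKeyFromKeyManager is a hypothetical stand-in for a key-retrieval call to an external key manager; the encryption itself uses the native .NET Aes class, so no special encryption library is forced on the developer.

    using System;
    using System.Security.Cryptography;

    class ExternalKeyPattern
    {
        // Hypothetical stand-in for a .NET key-retrieval assembly call
        static byte[] RetrieveKeyFromKeyManager(string keyName)
        {
            throw new NotImplementedException("fetch the key from the external key manager");
        }

        static byte[] Encrypt(byte[] plaintext, out byte[] iv)
        {
            byte[] key = RetrieveKeyFromKeyManager("CUSTOMER-DATA-KEY"); // key is never stored locally

            using (Aes aes = Aes.Create())   // native .NET managed-code encryption
            {
                aes.Key = key;
                aes.GenerateIV();
                iv = aes.IV;
                using (ICryptoTransform enc = aes.CreateEncryptor())
                    return enc.TransformFinalBlock(plaintext, 0, plaintext.Length);
            }
        }
    }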

Developers also need some good example code to help speed development. So we’ve provided that with our Alliance Key Manager solution. You can install a working .NET GUI application that retrieves encryption keys from the server, and the install includes the Visual Studio project and source code.  And there are C# examples if you develop in that Microsoft language.

The last thing we’ve done to support the Microsoft Windows platform is integrate our encryption key retrieval routines with the Windows certificate store and native Windows communications facility. When a Windows application authenticates to the Alliance Key Manager server, the certificates used for the secure TLS connection are under Windows security and control. And TLS communication is done with native Windows communications APIs. This reduces the chance of loss of certificates and private keys, supports the MMC management of certificates, and integrates with Microsoft’s patch update strategy. That is just one less component to worry about.
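
In outline, the client side of that handshake looks like the following sketch, using the standard .NET X509Store and SslStream classes. The certificate subject name, host, and port are hypothetical.

    using System.Net.Security;
    using System.Net.Sockets;
    using System.Security.Authentication;
    using System.Security.Cryptography.X509Certificates;

    class TlsWithWindowsCertStore
    {
        static SslStream Connect(string host, int port)
        {
            // Client certificate lives in the Windows certificate store (MMC-manageable)
            var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
            store.Open(OpenFlags.ReadOnly);
            X509Certificate2Collection certs = store.Certificates.Find(
                X509FindType.FindBySubjectName, "key-manager-client", false);

            // TLS over the native .NET/Windows communications stack
            var client = new TcpClient(host, port);
            var ssl = new SslStream(client.GetStream());
            ssl.AuthenticateAsClient(host, certs, SslProtocols.Tls12, true);
            return ssl;
        }
    }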

The combination of an affordable FIPS-140-2 compliant key management solution with deep support for the Microsoft developer makes our Alliance Key Manager a great option for Windows users who need to meet security best practices and compliance regulations for key management.

You can get more information about Alliance Key Manager here.

And more information about support for the Microsoft .NET environment here.

Happy programming!

Patrick

Topics: Encryption, Key Management, C#, Microsoft, .NET

XML, Web Services, and Encryption

Posted by Patrick Townsend on Dec 15, 2010 11:29:00 AM

One clear direction I’ve observed over the last few months is the focus of QSA auditors and other security professionals on the protection of sensitive data AFTER it traverses the Internet and then lands in a database on a hard disk drive. We have really good ways of protecting data in transit using 128-bit SSL encryption. For example, the web protocols HTTPS and FTPS provide the ability to encrypt data in transit, and Secure Shell (SSH) also provides strong encryption. But after the data reaches the end point of its journey it lands on a hard drive somewhere, and it is often exposed to loss at that point. I believe that’s why security auditors are putting a lot of emphasis now on making sure that data is encrypted when it hits a hard drive.

Many companies have implemented web services in combination with the XML data standard to take advantage of low cost, real time integration with their customers and vendors. When you combine the ubiquity of the web HTTPS protocol with the W3C XML standard, you get a powerful incentive to use this platform for business integration.
 
But care should be given to what happens to data when it leaves the realm of encrypted transit and lands on server hard drives.

Of course, the right thing to do is encrypt sensitive data before it lands on the hard drive. This means that the tools you are using have to support encryption as a natural part of the process of converting XML data. Standard XML processing tools such as Xerces and XPath do not have built-in encryption. The same is true for XML toolkits and APIs provided by IBM, Microsoft, and others. This leaves it to developers to try to intercept data after it is transformed from XML and before it lands in a database table or on a hard drive. That’s a real challenge.
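
In practice, that interception looks something like the sketch below: pull the sensitive field out of the inbound XML and encrypt it before anything is written to the database. The element name and the key source are hypothetical, and a full implementation would manage the IV and key retrieval properly.

    using System;
    using System.Security.Cryptography;
    using System.Text;
    using System.Xml.Linq;

    class XmlFieldEncryption
    {
        static byte[] EncryptCardNumber(string xml, byte[] key, byte[] iv)
        {
            XDocument doc = XDocument.Parse(xml);
            string pan = (string)doc.Root.Element("CardNumber"); // sensitive field from the XML payload

            using (Aes aes = Aes.Create())
            {
                aes.Key = key;   // retrieved from an external key manager
                aes.IV = iv;
                using (ICryptoTransform enc = aes.CreateEncryptor())
                {
                    byte[] data = Encoding.UTF8.GetBytes(pan);
                    return enc.TransformFinalBlock(data, 0, data.Length); // ready for the database INSERT
                }
            }
        }
    }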

In our Alliance XML/400 web services product on the IBM platform we built encryption right into the data transformation process about four years ago. Alliance XML/400 customers can protect sensitive data by just enabling the encryption option on a translation map. The solution does the rest. The data is encrypted before insertion into the database and there is no exposure as the data lands in the database on the hard drive. Our customers are taking advantage of this feature to meet PCI and other compliance regulations.

For non-IBM System i environments we now provide an easy way to retrieve encryption keys and perform encryption in a variety of development languages such as Microsoft .NET, Java, and C/C++.

Encryption can help protect against another common threat, too. At the annual PCI SSC standards council meeting in Orlando this year, forensics expert Chris Novak of Verizon talked about how more than 75 percent of data loss events begin with a well-known weakness that hasn’t been patched, and half of these are based on SQL injection attacks. With SQL injection, the attack on your servers starts with bad data inserted into a database in the clear, leaving open a later exploit. There are ways to prevent SQL injection through programming techniques, but encryption will also help defeat them.
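
The standard programming technique is to bind user input as parameters rather than splicing it into the SQL text, so input is treated as data, never as code. A minimal C# sketch (the table and column names are hypothetical):

    using System.Data.SqlClient;

    class SafeQuery
    {
        static void InsertCustomer(SqlConnection conn, string name, byte[] encryptedPan)
        {
            const string sql =
                "INSERT INTO Customers (Name, EncryptedPAN) VALUES (@name, @pan)";
            using (var cmd = new SqlCommand(sql, conn))
            {
                cmd.Parameters.AddWithValue("@name", name);        // bound as data, not concatenated
                cmd.Parameters.AddWithValue("@pan", encryptedPan); // column holds ciphertext, per the point above
                cmd.ExecuteNonQuery();
            }
        }
    }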

Will encrypting your data provide all of the security protection you need? Certainly not. I like to think of it this way:  Wearing a parachute on a skydiving expedition is no guarantee that you won’t be hurt when you land.  But that doesn’t mean you wouldn’t wear one, right? I think of encryption in the same way.

To view a replay of a recent webinar we presented on XML & Web Services, click here.

Patrick

Topics: Encryption, HTTPS, HITECH, HIPAA, AES, PCI, SFTP, web services, XML, FTPS, SSL/TLS, SSL