Townsend Security Data Privacy Blog

Patrick Townsend

Recent Posts

RSA 2011 Security Take Away: Mobile Two Factor Authentication is Hot

Posted by Patrick Townsend on Feb 28, 2011 8:26:00 AM

One thing that jumped out at me at this year’s RSA conference in San Francisco was the number of new vendors showing off mobile identification solutions. There were at least four new vendors of mobile-based two factor authentication solutions, and one regular exhibitor with a new entry in this area. These vendors didn’t have the biggest booths or the most lavish give-aways, but as a category they certainly made a big splash.

I think there are really two things responsible for this big change: two factor authentication is now more important for security, and everyone now carries a cell phone or mobile device. The second part of this is completely obvious. In fact, I often see people carrying multiple cell phones. The ubiquity of the cell phone makes it an ideal platform for delivering a one-time password or PIN code. And phone numbers are a lot easier to manage than hardware tokens.

The first part of this, the change in the security landscape, is not as well known to many people. As we’ve moved to a de-perimeterized security reality, we are more dependent on passwords to authenticate the users of our systems. And security professionals know how weak that dependence is. People who access our systems persist in the use of weak passwords, and the bad guys get better and better at password cracking and harvesting. By itself, password authentication is a poor defense, and that’s why two factor authentication is getting a lot of attention.

So what is two factor authentication? It means that you use two different authentication methods to access a system. Those authentication methods include:


•    Something you know (like a password or PIN code)
•    Something you are (fingerprint, iris)
•    Something you have (cell phone, HID card, hardware token)

By combining two of these authentication methods during system access you greatly reduce the chance of a security breach. For web applications, you generally find the use of a password with a PIN code generated by a hardware token (something you know, something you have), because it is really hard to use a fingerprint reader or iris scanning device (something you are). And that’s why cell phone based two factor authentication is picking up steam.

Don’t be confused by security systems that use one factor twice. I’m sure you’ve seen it at work on banking web sites. First you enter a password, then you answer a personal question (where were you born, the age of your oldest child, etc.). This is one factor authentication (something you know) used twice. This is when 2 times 1 is not equal to 2.  The use of one factor authentication twice does not add up to two factor authentication, and does not provide the same level of security.

Cell phones and mobile devices are a great way to deliver that second authentication factor. You have to have your cell phone to get the one time PIN code used for authentication. And everyone has one.
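If you’re curious how a mobile one-time PIN is typically produced, here is a minimal sketch of a time-based one-time password in the style of RFC 6238, the open standard many soft-token apps follow. The 30-second time step and 6-digit code length are common defaults, not requirements of any particular vendor, and the Base32 secret shown is purely illustrative.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238 style, HMAC-SHA1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // step                 # 30-second time window
    msg = struct.pack(">Q", counter)                   # counter as 8-byte big-endian
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                         # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Illustrative shared secret only -- in practice it is provisioned to the phone.
print(totp("JBSWY3DPEHPK3PXP"))
```

The server holds the same shared secret, computes the code for the current time window, and compares. Possession of the phone (something you have) is what makes the code meaningful.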

For more information on data security and compliance issues, visit the regulatory compliance section of our website to learn more.

Patrick

Topics: system security, two factor authentication, mobile identification

Data Privacy in a De-Perimeterized World

Posted by Patrick Townsend on Feb 25, 2011 8:33:00 AM
I just listened to a discussion of database security hosted by Oracle that was very well done. At one point the discussion turned to current threats and how the Enterprise has lost the ability to use perimeter protection for sensitive data. This has been a topic of much discussion in the security area for the last few months. Perimeter protection is based on the use of firewalls and similar technologies to keep the bad guys out, but this approach is failing with the advance of more sophisticated attacks, the use of social media by large organizations, the advance of mobile technologies, insider threats, and the migration of applications to cloud platforms. The trend is called “de-perimeterization” and represents a bit of a challenge to organizations that need to protect sensitive data.

Vipin Samir and Nishant Kaushik did a great job of describing how the process of de-perimeterization has forced companies to fall back on user access controls to protect data. But user access controls are notoriously weak. Weak passwords and sophisticated password cracking routines make it almost impossible to properly secure a database. So what is a security administrator to do?

Here are the suggestions from the panel that are a part of a defense-in-depth strategy:

Use Encryption to Protect Data:
Companies should use encryption at the database level or column level to protect data. This will secure data at rest on backup tapes and on disk in the event a drive is replaced. Encryption is an important part of the data protection strategy, but it needs to be combined with other techniques.

Use Good Key Management:
Protecting encryption keys is the most important part of the encryption strategy. Good key management techniques are needed, and the keys must be separated from the data they protect. Without this separation from protected data it is impossible to implement separation of duties and dual control – important parts of the key management strategy. See our Alliance Key Manager solution for more information about securing encryption keys.
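As a rough illustration of the “keys separated from data” idea, here is a minimal Python sketch in which the application fetches its data encryption key from an external key manager at run time and uses it to encrypt a column value. The fetch_key_from_key_server function is a hypothetical placeholder for whatever client API your key manager provides; nothing about it is specific to any particular product.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def fetch_key_from_key_server(key_name: str) -> bytes:
    """Hypothetical call to an external key manager over an authenticated
    channel. The point is that the key is retrieved at run time and never
    stored on disk next to the database it protects."""
    raise NotImplementedError("wire this to your key manager's client library")

def encrypt_column_value(key_name: str, plaintext: bytes) -> bytes:
    key = fetch_key_from_key_server(key_name)      # e.g. a 256-bit AES key
    nonce = os.urandom(12)                         # unique per encryption
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt_column_value(key_name: str, blob: bytes) -> bytes:
    key = fetch_key_from_key_server(key_name)
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)
```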

Separation of Duties:
Because the threat from insiders is rising, it is important that the management of encryption keys be separate from the management of databases. Database administrators responsible for our relational databases should not have access to encryption key management, and security administrators should not manage databases. This is a core principle in data security regulations such as PCI DSS, but it is often overlooked.

Context Sensitive Controls and Monitoring:
The last important step is to be sure that data access controls are sensitive to the data and its context. Bill in shipping has access to the order database, but should he really be decrypting the credit card number? Does your encryption solution detect and block this type of event? How will you monitor this security event? Or, Sally is authorized to view HR data from the accounting application, but should she really be using FTP to transfer this data? Normal encryption functions would not provide adequate protection from these types of data access. Context sensitive controls are needed to augment encryption.

When we started planning for automatic encryption in our Alliance AES/400 product two years ago, we took care to implement context sensitive controls right in the decryption APIs. That capability is now available with V7R1 of the IBM i operating system. We avoided the error of basing these controls on user account authorities and native OS security. Just because the operating system says you have read access rights to a database table doesn’t mean you should be decrypting the social security number or using FTP to transfer the file. I’m happy with our implementation, which is based on explicit authorization by a security administrator and application white lists.
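The following is not our product’s code, just a toy sketch of the general idea of whitelist-gated decryption: even when the operating system grants read access, the decrypt call itself checks an explicit policy set by a security administrator. All of the user, program, and field names are made up for illustration.

```python
# Hypothetical policy: which (program, field) combinations each user may decrypt.
DECRYPT_WHITELIST = {
    ("PAYROLL", "SSN"):   {"HRADMIN1", "HRADMIN2"},
    ("ORDERS",  "CCNUM"): {"BILLCLERK"},
}

def authorized_decrypt(user: str, program: str, field: str,
                       ciphertext: bytes, decrypt) -> bytes:
    """Decrypt only if the security administrator whitelisted this combination."""
    allowed = DECRYPT_WHITELIST.get((program, field), set())
    if user not in allowed:
        # Deny and surface a security event -- OS read rights alone are not enough.
        raise PermissionError(
            f"decrypt denied: user={user} program={program} field={field}")
    return decrypt(ciphertext)
```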

You can get more information and request an evaluation version of our Alliance AES/400 solution here.

You can find the Oracle presentation here. Look for “How secure is your Enterprise Application Data?”

Patrick

Topics: Key Management, De-Perimeterization, Oracle, Separation of Duties, Alliance AES/400, Encryption Key Management, Defense-in-Depth, automatic encryption, AES Encryption

The Uncomfortable Truth About Key Management on the IBM i

Posted by Patrick Townsend on Feb 15, 2011 10:21:00 AM

Last week we presented a webinar on PCI requirements for encryption key management. Many of the people who attended were encrypting data on the System i and curious about how to manage their encryption keys according to new PCI requirements.

The way organizations are managing encryption keys is falling under more scrutiny by QSA auditors.  Companies must demonstrate they are enforcing dual control and separation of duties.  

How is this achieved on the System i? Is it still effective to use an integrated key management solution that stores encryption keys in the same partition as the encrypted data? The answer, quite simply, is "no." PCI DSS states the following requirements for key management:

 

  • Dual Control means that at least two people should be required to authenticate before performing critical key management tasks.
  • Separation of Duties means that the individuals managing encryption keys should not have access to protected data such as credit cards, and those that have access to protected data should not have the authority to manage encryption keys.

Now here is an uncomfortable truth. On the IBM i you simply can't achieve these PCI requirements if you store the encryption key in the same partition as the protected data. The QSECOFR user profile (and any user profile with *ALLOBJ authority) will always have complete access to every asset on the system. An *ALLOBJ user can circumvent controls by changing another user's password, replacing master keys and key encryption keys, changing and/or deleting system logs, managing validation lists, and directly accessing database files that contain encrypted data.

From the perspective of PCI, an integrated key management system puts too much control into the hands of any single individual.

The only way to comply with PCI requirements for key management is to store the encryption keys off of the IBM i. Take users with *ALLOBJ out of the picture completely. When you use a separate appliance to manage encryption keys, you can grant a user access to the protected data on the System i and deny that same user access to the key manager. Now you have enforced separation of duties. And with the right key management appliance you can require TWO users to authenticate before keys can be managed, giving you dual control of encryption keys.
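As a simple illustration of dual control (again, a sketch, not any vendor’s implementation), a critical key management operation can refuse to run unless two distinct, separately authenticated key officers have approved it:

```python
def rotate_master_key(approved_by: set[str]) -> None:
    """Perform a critical key management task only under dual control."""
    if len(approved_by) < 2:
        raise PermissionError("dual control: two distinct key officers must approve")
    # ... proceed with the key rotation on the key management appliance ...
    print(f"master key rotation approved by {sorted(approved_by)}")

# Each name represents an officer who has already authenticated separately.
rotate_master_key({"key_officer_a", "key_officer_b"})
```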

Let us know about the key management strategy at your company. Is your organization encrypting data on the System i?  How are you managing the encryption keys? If you store them on a separate partition, have you had a recent PCI audit?  What did your auditor say?

Topics: Compliance, Encryption, Key Management, IBM i, encryption key

A Big Win for IBM i Customers and Townsend Security

Posted by Patrick Townsend on Feb 11, 2011 1:35:00 PM

Earlier this month we released a comprehensive upgrade to our secure managed file transfer solution – Alliance FTP Manager. This latest release incorporates a number of existing Townsend Security products that were previously priced separately and features new capabilities. FTP Manager 5.2 brings together the existing products Alliance FTP Security, Alliance Cross Data, and Alliance All-Ways Secure into a single product.

This is really a big win for our existing customers as well as IBM i customers. Our existing Alliance FTP Manager customers automatically receive the upgrade to FTP Manager 5.2, and so do all of our existing Cross Data and All-Ways Secure customers. We know there are a variety of security challenges facing IBM i customers who send data over networks, and FTP Manager 5.2 provides those customers with the most comprehensive and flexible Secure Managed File Transfer offering available.

Highlights of FTP Manager 5.2 include encrypted PDF and encrypted Zip functionality. The new encrypted PDF functionality allows customers to generate encrypted and un-encrypted PDF documents using a familiar interface. And now that FTP Manager 5.2 fully supports Zip encryption to the WinZip standard, it provides IBM i customers with a new tool to meet compliance regulations. Users can create Zip files on the IBM i platform and then use a variety of delivery methods to send the Zip files to customers, vendors, and employees.

There are a lot of exciting things on the horizon for Townsend Security and our customers. This release of FTP Manager 5.2 is just the start. October promises to be a month full of major accomplishments and milestones.

Topics: Alliance FTP Manager, Secure Managed File Transfer, ZIP, PGP, PDF

Non-Standard Encryption – Now That Bites

Posted by Patrick Townsend on Feb 11, 2011 1:33:00 PM

In our encryption practice we often help customers integrate the exchange of encrypted data between different applications within the organization, and between their own applications and a vendor’s or customer’s application. It is truly amazing to me how often we encounter non-standard encryption that makes this integration very difficult. The problem is not the lack of standards for encryption. Most compliance regulations provide clear guidance and references to encryption standards. Here is what the PCI Data Security Standard (PCI DSS) Navigation Guide says about encryption (my emphasis):

The intent of strong cryptography (see definition and key lengths in the PCI DSS and PA-DSS Glossary of Terms, Abbreviations, and Acronyms) is that the encryption be based on an industry-tested and accepted algorithm (not a proprietary or "home-grown" algorithm).

Strong Encryption:
Cryptography based on industry-tested and accepted algorithms, along with strong key lengths and proper key-management practices. Cryptography is a method to protect data and includes both encryption (which is reversible) and hashing (which is not reversible, or “one way”). Examples of industry-tested and accepted standards and algorithms for encryption include AES (128 bits and higher), TDES (minimum double-length keys), RSA (1024 bits and higher), ECC (160 bits and higher), and ElGamal (1024 bits and higher). See NIST Special Publication 800-57 for more information.

The problem seems to be a general lack of knowledge about encryption. And this applies to some vendors of encryption solutions. Here are a couple of examples:

One of our customers was having trouble decrypting a field with our software that was encrypted on a Windows server with “256-bit AES using CBC mode”. That seemed to be a pretty straightforward task. Yet we couldn’t get the data decrypted properly. The output just looked like garbage. We spent a fair amount of time with the customer verifying that they had the right decryption key, that the initialization vectors for the decryption were specified correctly, and that our software was being used correctly. But nothing was working. We then asked the third party software vendor to share their AES source code with us. In this case the vendor was very accommodating and let us review their source code implementation of AES encryption.

Voila! The source code showed that the implementation was using a 256-bit block size for the encryption algorithm. The AES standard (FIPS-197) requires the use of 128-bit block sizes. The use of the larger block size meant that this was not even AES encryption according to the industry standard. The vendor fixed their implementation and the project was successful. But our customer spent a lot of time and money resolving the problem.
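For reference, standard AES always uses a 128-bit (16-byte) block, no matter whether the key is 128, 192, or 256 bits; only the key length varies. Here is a minimal sketch of standards-conforming 256-bit AES in CBC mode using the Python cryptography package, with PKCS7 padding chosen for illustration (the padding scheme is a separate interoperability detail you would agree on with the other party).

```python
import os
from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(32)   # 256-bit key
iv = os.urandom(16)    # IV is one block: always 16 bytes for AES

# Pad to the 128-bit AES block size, then encrypt in CBC mode.
padder = padding.PKCS7(128).padder()
padded = padder.update(b"4111111111111111") + padder.finalize()
encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
ciphertext = encryptor.update(padded) + encryptor.finalize()

# Anyone with the same key, IV, mode, and padding convention can decrypt it.
decryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
unpadder = padding.PKCS7(128).unpadder()
recovered = unpadder.update(decryptor.update(ciphertext) + decryptor.finalize())
recovered += unpadder.finalize()
assert recovered == b"4111111111111111"
```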

Another example of getting into trouble occurred with a customer who deployed an AES encryption solution that used the CUSP mode of encryption. This rang alarm bells right away.  We knew that CUSP was not one of the NIST approved modes of encryption, and we had never encountered it before. We quickly learned that CUSP stood for “Cryptographic Unit Service Provider” and was implemented by IBM in a couple of their server products. This customer had a couple of problems. CUSP mode was not a standard mode of encryption, and data encrypted with CUSP was not going to be decrypted by any of the standard open source or commercial products in the market. So this customer was locked into an incompatible encryption strategy.

The second problem is that the CUSP mode of encryption is a proprietary protocol. The PCI DSS clearly states that strong encryption must be based on industry standards and not proprietary protocols (see above). As I interpret the PCI DSS requirements for encryption, a customer using CUSP mode would be out of compliance. That’s going to hurt the next time a QSA takes a hard look at your encryption implementation. We recently published a white paper on the weaknesses of the CUSP mode of encryption. Click here to download it.

One way to ensure that your AES encryption software is compliant is to look for a NIST certification. A NIST AES Validation certificate, or a NIST FIPS-140 certificate, is pretty good assurance of compliance. The FIPS-140 certification process requires AES Validation, so that certification is incorporated by reference. That’s why either certification will give you the assurance that AES encryption is being done according to the standard. Lacking certification, you are relying on the promises of a vendor or developer who may be ignorant, or have a motivation to be less than forthcoming. Not a good situation to find yourself in.

Both of these customers spent a fair amount of money fixing the problem.  An entirely avoidable expense.

Patrick

Topics: Encryption, NIST, CUSP, PCI DSS, AES, PCI, FIPS-140

SQL Security Attacks: Same Ole, Same Ole

Posted by Patrick Townsend on Feb 11, 2011 1:29:00 PM

The PCI conference is just finishing up and I’ll have more on the conference a bit later when information can be made public. One of the interesting talks was by Chris Novak of Verizon. I just want to recap some of his thoughts.

The number of highly targeted and sophisticated attacks is on the rise. We’ve heard this message over the last few months, and the Verizon study confirms this. A highly targeted attack goes after a specific company’s assets with a lot of knowledge about the internal systems. The ability of the attacker to cloak the software is good, and the sophistication of the attack is high. These are highly knowledgeable technicians creating the malware, and they are getting good at attacking sophisticated systems such as Hardware Security Modules (HSMs) that store or process encryption keys. The targets are usually companies with a large concentration of high value financial information. It’s expensive to launch these types of attacks, and the payoff is high. When these breaches are discovered they get a lot of press. But here’s an interesting fact: these sophisticated attacks on specific targets are still in the minority. Over 80 percent of the breaches are pretty low-tech. That’s right, amazingly most of the attacks are against web servers, and most of these use SQL injection as the way to get inside. What’s stunning is that we’ve known about SQL injection attacks for a long time. We know how the attacks are made, we know how to test for the weakness, and we know how to remediate the problem. So why haven’t we made much progress in preventing this exposure?

First, we aren’t paying enough attention to our web sites that are not directly related to credit card processing. Novak used the example of an attack on a company’s HR web site. The company posted job openings on the site and did not think it was much of a target. But attackers gained entry to this job postings web site, and then navigated through the internal network to a high value target.

The lesson? We know what to do, we just need to apply that knowledge more broadly.

We might also ask why we are still having a problem with SQL injection. Novak had an interesting take on this, too. To prevent a SQL injection attack you have to use good programming practices. You can’t just plug in an intrusion detection device and think you are safe. You prevent SQL injection by changing the way you develop web and business applications. Do you know what OWASP is? If not, there’s a good chance you are exposed somewhere in your application code to SQL injection.

The lesson? We have to get much more serious about secure programming. If you are a developer of web or business systems, you should know the OWASP Top 10 forwards and backwards. If you manage a development group, be sure everyone is trained on secure programming practices. Make it a requirement for hiring and promotion.
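To make the point concrete, here is a minimal sketch of the single most effective habit against SQL injection: bind user input as parameters instead of concatenating it into the SQL text. The example uses Python’s built-in sqlite3 module and a made-up table purely for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

user_input = "x' OR '1'='1"   # a classic injection payload

# Vulnerable pattern: the payload becomes part of the SQL statement itself.
# conn.execute("SELECT email FROM users WHERE name = '" + user_input + "'")

# Safe pattern: the driver binds the value as data, never as SQL.
rows = conn.execute(
    "SELECT email FROM users WHERE name = ?", (user_input,)).fetchall()
print(rows)   # [] -- the payload matches nothing and injects nothing
```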

I will leave you with one more interesting point from Novak’s talk. In almost all breaches that Verizon studied, there was enough information in the system logs to identify the attack, but the logs were not reviewed and the attack went undetected.

The lesson? We need to monitor our system logs a lot better. This means investing in software that can automatically sort through the large number of logs and tell us when there is a problem.
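Even a crude automated pass over the logs beats not looking at all. The sketch below just scans a hypothetical log file for a few suspicious signatures; real log monitoring products do far more (correlation, baselining, alerting), but the principle is the same.

```python
import re
from collections import Counter

# A few illustrative signatures: injection probes, traversal attempts, auth failures.
SUSPICIOUS = re.compile(
    r"(?i)(union\s+select|or\s+'1'='1|\.\./\.\.|failed password)")

hits = Counter()
with open("access.log") as log:          # hypothetical log file name
    for line in log:
        match = SUSPICIOUS.search(line)
        if match:
            hits[match.group(1).lower()] += 1

for signature, count in hits.most_common():
    print(f"{count:6d}  {signature}")
```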

There are positive changes under way at the PCI council and I will discuss these more in the days ahead. But one big take-away is that we need to do better at what we already know we should be doing. This part is not rocket science.

Patrick

Topics: HSM, Verizon, SQL, Security Attacks

Encrypted PDF & ZIP on the IBM i

Posted by Patrick Townsend on Feb 11, 2011 1:14:00 PM

IBM i (AS/400, iSeries) users send a lot of sensitive information to their customers, vendors, and employees which needs to be protected with strong encryption. Our customers today are using our PGP encryption solution to protect files. But there has been a big need to generate and protect information in common PC formats. With our upcoming release of Alliance FTP Manager for IBM i, we are stepping up our support with encrypted Zip files and encrypted PDF files.

Zip compression is very commonly used to send files via email. Not only does zip compression make our email attachments smaller, but the most popular zip compression programs now support 256-bit AES encryption of the contents. The ability to encrypt Zip files with AES provides a much better level of security than older zip protection methods. In the new release of Alliance FTP File Manager for IBM i we fully support Zip encryption to the WinZip standard. This means that you can create and protect Zip files on your IBM i platform, and then use a variety of delivery methods to get the Zip files in the hands of your customers, vendors, and employees. This will immediately give IBM i customers a new tool to meet compliance regulations.

The new Zip support provides rich capabilities to IBM i users. You can create encrypted or un-encrypted zip archives, include sub-directories, and use wild cards to select files. When uncompressing and decrypting you can specify any directory as the target for the files. This capability integrates with our automation facilities for processing received files. Lastly, we provide a Windows command line Zip application to help our customers who don’t already have a Zip application. I’m confident that this new capability will help our customers achieve a better level of security.
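Our product handles this natively on the IBM i; for readers who want to see what WinZip-compatible AES Zip looks like in code on another platform, here is a hedged sketch assuming the third-party Python library pyzipper. The archive name, file name, and password are all made up.

```python
import pyzipper  # third-party: pip install pyzipper (assumption, not part of our product)

password = b"correct horse battery staple"

# Create a WinZip-compatible, AES-encrypted archive.
with pyzipper.AESZipFile("invoices.zip", "w",
                         compression=pyzipper.ZIP_DEFLATED,
                         encryption=pyzipper.WZ_AES) as zf:
    zf.setpassword(password)
    zf.writestr("invoice_2011_02.txt", b"sensitive payment details")

# Any reader that supports the WinZip AES standard can open it with the password.
with pyzipper.AESZipFile("invoices.zip") as zf:
    zf.setpassword(password)
    print(zf.read("invoice_2011_02.txt"))
```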

The other new security technology in FTP Manager for IBM i is our encrypted PDF support. In this first implementation, our customers will be able to create encrypted PDFs with their own content, and then use the automation facilities to distribute the PDFs via email, FTP, and other distribution methods. Encrypted PDF support includes the ability to set fonts and colors, embed watermark and graphic images, set headers and footers, and create tables and lists. The resulting encrypted PDF file is compatible with any PDF reader that supports the AES encryption standard for PDF. We’ve tested with a wide variety of PDF readers on PCs, Apple Macs, Blackberry, Linux desktops, and so forth. This will give our customers an additional tool to secure their sensitive data.

These new technologies will increase IBM i customers’ ability to meet compliance regulations and secure sensitive data. I hope you get the idea that we are dedicated to helping you protect your sensitive data and corporate assets. You are going to see a lot more of these types of new capabilities as we go forward.

Patrick

Topics: IBM i, ZIP, FTP Manager for IBM i, PGP, PDF

Encryption, Key Management and Tokenization - The Trifecta

Posted by Patrick Townsend on Feb 11, 2011 11:47:00 AM

Compliance regulations are moving inexorably towards requiring the protection of sensitive data. The private information of customers, employees, patients, vendors, and all of the people we come into contact with as enterprises must be protected from loss and misuse. At the same time that regulations are getting more teeth, there is more consensus about the technologies to protect data in our applications and databases. Encryption and tokenization are now the accepted methods of protecting data, and encryption key management is central to both technologies.

How fast are regulations changing? Really fast. The Payment Card Industry Security Standards Council will update the PCI Data Security Standard (PCI DSS) this year, and will be on a three year cycle of updates. Merchants accepting credit cards will have about 18 months to implement the changes. State privacy laws are undergoing frequent changes, most of which make the rules more stringent. Minnesota, Nevada, and Washington State have made recent changes. The HITECH Act of 2009 and related guidance further tightens the rules around protecting patient data, and further guidance is expected this year. Last, but not least, the federal government is moving new legislation through Congress to enact a national privacy law.

These changes are coming fast, and they have one thing in common: data protection requirements are getting stronger, not weaker. Companies and organizations should be paying attention to their data protection strategies now in order to avoid expensive rip-and-tear operations in the future.

One other tendency of the evolving regulations is this: a clear reference to standards for data protection. All of the mentioned regulations now make reference to widely accepted standards, usually those of the National Institute of Standards and Technology (NIST), which publishes standards and testing protocols for encryption and key management. Over the last two years PCI (and related guidance from Visa), the HITECH Act, state privacy laws, and other regulations have specifically referenced NIST for data encryption and key management standards.

Companies and organizations acquiring data protection technologies should look carefully at how solutions match up to the standards. And a word of warning here: There is still a lot of snake oil in the data protection industry. Be sure that your data protection vendor can prove that their solutions actually meet the NIST standards. This is actually not hard to independently verify – NIST publishes on-line lists of vendors who certify their solutions to the standard.

Encryption is a well defined technology to protect data through the use of an encryption algorithm and secret key. When combined with proper key management, encryption provides a well accepted method of protecting sensitive data. There is a long history of work by professional cryptographers and NIST on defining how good encryption and key management should work, and you can easily determine which vendors meet the standard through the certification process.

Tokenization is a newer technology that lacks the history and standards of encryption, but it incorporates encryption technologies. Tokenization works by substituting a surrogate value (or “token”) for the original data. By itself the token does not tell you anything about the original value, which might be a credit card number, patient ID, and so forth. But tokenization requires that you use good encryption practices to protect the sensitive data in the token database. This means you have to use a tokenization solution that meets the same stringent standards for encryption and key management. When acquiring a tokenization solution, you will want to use the same diligence about encryption and key management that you would use for a pure encryption solution – that is, the solution should be built to standards and the vendor should be able to prove it through the NIST certification process.
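Here is a toy sketch of the tokenization idea: the token is a random surrogate that reveals nothing, and the real value is kept encrypted in a vault whose key should live in a proper key manager. A dictionary stands in for the vault database, and Fernet (from the Python cryptography package) stands in for whatever standards-based encryption the real solution uses.

```python
import secrets
from cryptography.fernet import Fernet

class TokenVault:
    """Toy token vault: random surrogate tokens, encrypted originals."""

    def __init__(self, key: bytes):
        self._cipher = Fernet(key)          # key belongs in an external key manager
        self._store: dict[str, bytes] = {}  # stand-in for the vault database

    def tokenize(self, pan: str) -> str:
        token = secrets.token_urlsafe(16)   # conveys nothing about the card number
        self._store[token] = self._cipher.encrypt(pan.encode())
        return token

    def detokenize(self, token: str) -> str:
        return self._cipher.decrypt(self._store[token]).decode()

vault = TokenVault(Fernet.generate_key())
token = vault.tokenize("4111111111111111")
print(token)                    # safe to pass to downstream applications
print(vault.detokenize(token))  # only the vault, with its key, can reverse it
```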

Remember, a tokenization solution will be IN SCOPE for a PCI audit!

Tokenization standards are still evolving. Bob Russo of the PCI Security Standards Council indicated that the council will be taking up work on this in 2010. Visa just released a best practices guide for tokenization (you can get it here), and you can probably expect the eventual standards to incorporate much of this guidance. Additionally, the X9 organization is also working on standards for tokenization.

In regard to tokenization standards, stay tuned! Much more is coming our way.

Encryption, tokenization, and key management – this is the trifecta for protecting data at rest. I’ll have more comments in the future about tokenization as we analyze the best practice guidance from Visa and help you connect the dots with our encryption, tokenization, and key management solutions.

Patrick

Topics: Encryption, NIST, Key Management, PCI DSS, tokenization

PCI DSS 2.0 and Encryption Key Management

Posted by Patrick Townsend on Feb 11, 2011 11:46:00 AM

2014 UPDATE:
No Significant Changes around Key Management in PCI DSS v3.0

The new PCI Data Security Standards (PCI DSS v2.0) are here and I’ve gotten a lot of questions about the changes related to encryption key management. Because we work with a lot of companies going through PCI compliance audits and reviews, the new standards just confirm the trends we’ve seen over the last few months on how QSA auditors and security professionals view encryption key management, and what they see as the minimum requirements for managing keys.  The clear trend is to require that encryption keys be stored separately from the data they protect, and to make sure that the people who manage encryption keys are not the people who manage the protected data. Let’s look at why this is happening.

While most of the largest merchants in the Level 1 category are already using professional key management solutions to protect encryption keys, the trend over the last 12 months is to require smaller merchants in the Level 2 and Level 3 categories to use better key management practices, too. So, what are the parts of PCI DSS that are driving this change? It all has to do with industry best practices for encryption key management, and the concepts of Dual Control, Separation of Duties, and Split Knowledge. These best practices and concepts work together to form the basis for determining if your approach to key management will pass muster.

First, what is the source of industry best practices for key management? Here in the US, the National Institute of Standards and Technology (NIST) is the most common source for guidance on best practices. The NIST special publication SP-800-57 provides specific pointers on best practices both for procedurally managing encryption keys and for what to look for in key management systems. In these documents you will find the genesis of most standards regarding encryption key management, including the concepts in PCI DSS 2.0 Section 3.

Next, it is important to understand Dual Control, Separation of Duties, and Split Knowledge. These are all clearly defined in the PCI DSS standard and in the accompanying PCI DSS glossary. I’ve extracted the exact definitions below, but I’ll recap them here from the point of view of key management.

Dual Control means that no one person should be able to manage your encryption keys. Creating, distributing, and defining access controls should require at least two individuals working together to accomplish the task.

Separation of Duties means that different people should control different aspects of your key management strategy. This is the old adage “don’t put all your eggs in one basket”. The person who creates and manages the keys should not have access to the data they protect. And the person with access to protected data should not be able to manage encryption keys.

Split Knowledge applies to the manual generation of encryption keys, or at any point where encryption keys are available in the clear. More than one person should be required to constitute or re-constitute a key in this situation.
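A common way to satisfy split knowledge when keys must be handled manually is an XOR split: each custodian holds one random-looking component, and no component by itself conveys anything about the key. A minimal sketch:

```python
import secrets
from functools import reduce

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, parts: int = 2) -> list[bytes]:
    """Split a key into components; all of them are needed to rebuild it."""
    components = [secrets.token_bytes(len(key)) for _ in range(parts - 1)]
    components.append(reduce(xor, components, key))
    return components

def join_key(components: list[bytes]) -> bytes:
    return reduce(xor, components)

key = secrets.token_bytes(32)          # a 256-bit key
halves = split_key(key, 2)             # hand one component to each key custodian
assert join_key(halves) == key         # only together do they reconstitute the key
```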

What are the practical implications of these best practices and core concepts? One of the practical implications follows from a common fact of system administration. On all major operating systems such as Linux, Windows, and IBM System i (AS/400) there is one individual who has the authority to manage all processes and files on the system. This is the Administrator on Windows, the root user on Linux and UNIX, and the security officer on the IBM System i platform. In fact, there are usually multiple people who have this level of authority. In one study by PowerTech, the average IBM System i customer had 26 users with this level of authority!

That’s why storing encryption keys on the same system where the protected data resides violates all of the core principles of data protection, and that’s why we are seeing auditors and payment networks reject this approach. If you haven’t faced this issue yet, your day is probably coming. Now is the time to start planning on how to deal with the problem.

Over two years ago we saw this trend developing and took action to help merchants be prepared for proper key management. We created the Alliance Key Manager solution and released it to our partner channel in 2009. This year we released it for direct sale, and last week we received our FIPS-140-2 certification from NIST. Over 1,000 customers are now using AKM to protect their encryption keys with a solution that provably meets industry standards.  Our encryption products have been updated to use this new key management solution, and we are moving customers forward to compliance. It’s been a long, hard slog to NIST FIPS-140 certification, but I think our customers will benefit from the effort.

I hope this has been helpful in clarifying key management best practices. For more information on PCI and key management, download our podcast titled "Key Management Best Practices: What New PCI Regulations Say." Please let us know if you have any questions.


---

From the PCI DSS version 2.0 Glossary:

Dual control
“Process of using two or more separate entities (usually persons) operating in concert to protect sensitive functions or information. Both entities are equally responsible for the physical protection of materials involved in vulnerable transactions. No single person is permitted to access or use the materials (for example, the cryptographic key). For manual key generation, conveyance, loading, storage, and retrieval, dual control requires dividing knowledge of the key among the entities. (See also Split Knowledge).”


Separation of Duties
“Practice of dividing steps in a function among different individuals, so as to keep a single individual from being able to subvert the process.”

Split knowledge
“Condition in which two or more entities separately have key components that individually convey no knowledge of the resultant cryptographic key.”

Source documents are available online at www.pcisecuritystandards.org

Topics: Compliance, Encryption, Key Management, PCI DSS, PCI, FIPS-140

Blackberry, Key Management, and Message Security

Posted by Patrick Townsend on Feb 11, 2011 11:44:00 AM

Many of us have been watching the on-going drama between RIM (makers of the ubiquitous Blackberry) and various governments around the world. Governments have been successfully pressuring RIM to provide access to their internal messaging servers in order to get access to encrypted messages sent and received by Blackberry users. I think RIM has been trying to fight this access as best they can. After all, one of their key product messages is around the security of their systems. In spite of that I suspect some governments have been successful in getting at least limited access to the Blackberry servers that process secure messages.

At first I was puzzled by this story when it started to emerge. I mistakenly thought that the private key needed to decrypt a message was stored on the receiver’s Blackberry and that the intermediate message servers would not have the key necessary to decrypt a message. I was apparently wrong about this architecture and it turns out that the Blackberry message servers do have the ability to decrypt messages in transit. That ability puts RIM in the uncomfortable headlights of law enforcement and security agencies around the world.

People have been asking me if a similar situation exists with other common encryption technologies. For example, when I encrypt a file with PGP, can it be decrypted by someone (a government? a credit card thief?) before it reaches the intended recipient? Before the drama with RIM I was not hearing this question, but now I think many people are wondering about it.

The short answer to the question is no: when you encrypt a file with PGP, it is not possible to decrypt it before it gets to the intended recipient. PGP is based on the widely used public/private key encryption technology deployed in many secure systems such as VPNs, web browsers, and secure FTP. When I encrypt some information with a public key, only the person holding the private key can decrypt the information. As long as I protect my private key, an intermediary can’t decrypt a message intended only for me. Almost all of our assumptions about security depend on this fact.
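For readers who want to see the principle in code: PGP actually encrypts your file with a random session key and then protects that session key with the recipient’s public key, but the asymmetric step is what guarantees that only the private key holder can recover the data. Here is a minimal sketch of that step using RSA-OAEP from the Python cryptography package; it illustrates the public/private key principle, not PGP’s full message format.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# The recipient generates the key pair; the private key never leaves their control.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Anyone (including an intermediary server) can encrypt with the public key...
ciphertext = public_key.encrypt(b"confidential message", oaep)

# ...but only the holder of the private key can decrypt the result.
assert private_key.decrypt(ciphertext, oaep) == b"confidential message"
```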

Is this system perfect? No. As a recipient of secure messages I may inadvertently disclose my private key or lose it by failing to protect it properly. Also, I may be legally compelled by a government agency to relinquish it. Many governments are now requiring people to disclose their private keys and passwords when ordered by a court to do so. You might think that you can’t be compelled to give up a password or private key, but I think that resolve might fade after a few days of sitting in a jail cell. The bottom line is this: public/private key technology is the best method we have of protecting sensitive information. When done well it prevents anyone but an intended recipient from reading the sensitive information. But it also means that you have to pay attention to how you manage and protect encryption keys. Proper encryption key management is essential to any data protection method you use. We’ll be talking more about this in the days ahead.

Patrick

Topics: security, Key Management, public/private key, Blackberry/RIM, PGP