
Townsend Security Data Privacy Blog

.NET Encryption and Key Management

Posted by Patrick Townsend on Aug 13, 2012 10:29:00 AM

Key Management in the Multi-Platform Environment


Download the white paper "Key Management in the Multi-Platform Environment"


If you have Microsoft SQL Server with Extensible Key Management (EKM), implementing encryption and key retrieval with Alliance Key Manager, our encryption key management Hardware Security Module (HSM), is easy. Our HSM comes with the Windows EKM Provider software that you install, configure, and deploy. Problem solved.

But what if you have a significant investment in Microsoft applications that don’t support EKM?

For example, you might have applications built on SQL Server 2005 or SQL Server 2008/2012 Standard Edition which do not support EKM. You could upgrade to SQL Server 2008 R2 or SQL Server 2012, but there might be application roadblocks that prevent the upgrade right now.

Or, you might have applications written in a .NET language that update non-Microsoft databases, or which work with unstructured data.

These technical hurdles won’t stop you from using our encryption key manager to meet compliance requirements for protecting encryption keys. We provide a .NET Assembly and DLL that you can add to your .NET project to retrieve encryption keys from the HSM. A few lines of C# code and you are retrieving the encryption key from the HSM, and the problem is solved.
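To make the retrieval pattern concrete: the product ships a .NET Assembly with its own C# classes, so the sketch below is purely illustrative Python with hypothetical names. In particular, `fetch_key_from_server` stands in for the mutually authenticated TLS call the client library makes to the key server; here it derives a deterministic demo key so the example is self-contained.

```python
import hashlib

# Hypothetical stand-in for the key server call. In production this would be
# a mutually authenticated TLS request to the HSM; here we derive a
# deterministic demo key so the sketch runs on its own.
def fetch_key_from_server(key_name: str) -> bytes:
    return hashlib.sha256(key_name.encode()).digest()  # 32 bytes = 256-bit key

class KeyCache:
    """Retrieve a key once, cache it for reuse, and wipe it when done."""
    def __init__(self):
        self._keys = {}

    def get(self, key_name: str) -> bytes:
        if key_name not in self._keys:
            self._keys[key_name] = fetch_key_from_server(key_name)
        return self._keys[key_name]

    def wipe(self) -> None:
        # Clear cached key material as soon as it is no longer needed.
        self._keys.clear()

cache = KeyCache()
key = cache.get("AES256-CUSTOMER-DATA")
assert len(key) == 32                              # 256-bit AES key
assert cache.get("AES256-CUSTOMER-DATA") is key    # second call served from cache
```

In the real product the class and method names differ; the point is only that a handful of application lines retrieves the key, while TLS, authentication, and logging are handled inside the vendor library.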

The sample code on the product CD will get you up and running quickly. There are C# sample applications with source code that you can use as a starting point in your projects. The Alliance Key Manager .NET Assembly works with any .NET language, including C#, VB.NET, and C++/CLI.

The Alliance Key Manager .NET Assembly also works with the Common Language Runtime (CLR) environment, and with Stored Procedures. And you can mix and match your .NET languages, databases, and OS platforms.

The combination of automatic encryption (EKM, TDE, Cell Level Encryption) with the Alliance Key Manager .NET Assembly code means that you won’t have any gaps in coverage across your Microsoft platforms.  Download our white paper "Key Management in the Multi-Platform Environment" for more information on securing your encryption keys.

Happy coding!

Patrick


Topics: Alliance Key Manager, Encryption, Key Management, Extensible Key Management (EKM), C#, Microsoft, .NET, SQL Server

Data Privacy for the Non-Technical Person Part 1

Posted by Luke Probasco on Aug 26, 2011 3:54:00 PM

As I attend industry events, it is surprising how many times we hear questions like “what constitutes personal information that needs to be protected?”  I recently sat down with Patrick Townsend, our Founder and CTO, to discuss data privacy for the non-technical person.

When speaking about data privacy, the conversation often turns technical with common questions like “How do we implement encryption and encryption key management?”  This time, we intentionally kept our conversation focused on data privacy topics that can be understood from a high-level. 

I have created a series of blog posts from this conversation that will be published over the next couple of weeks.  Hopefully this blog series will answer any questions that you might have.  If you still have questions, feel free to send us an email.

What constitutes personal information that needs to be protected?

The first thing that everyone thinks of are credit cards numbers.  We know that we don’t want our credit card numbers escaping into the wild and having to go through the process of replacing them.  I think that by now, most people have experienced getting a call from their bank, being alerted to potential fraud, and going through the process of having to replace a card.  So credit card numbers are obviously personal information that people need to protect.

There are also other things that I think are important – financial bank account numbers.  We are all doing a little bit more now in terms of online banking.  Those bank account numbers carry value and we need to be very careful about that.  There are also some other items that tend to be used to commit financial fraud, such as social security numbers, driver’s license numbers, birthdate, etc.  In fact, information like your passport number, military ID, or health ID – all of those are examples of information that you should try and protect and make sure you are not sending them around or leaving them in places that can be easily picked up.

Other things like maiden name or previous addresses are also important.  Think about the types of questions your bank asks you when you give them a call.  They are using that information to identify you, and fraudsters will use that information to impersonate you.  These are all examples of sensitive information that we should be protecting.  For people who are interested, the technical term for this type of information is Personally Identifiable Information, or PII.

Stay tuned for our next installment in this series.  Download our podcast “Data Privacy for the Non-Technical Person” to hear more of this conversation.



Topics: Encryption, Key Management, Data Privacy

10 Questions to Ask Your Key Management Vendor

Posted by Luke Probasco on Mar 29, 2011 8:14:00 AM

The modern Enterprise deploys a variety of server platforms, operating systems, and programming languages.  A major barrier to deploying encryption has been the challenge of accessing encryption keys from these widely divergent environments.  Encryption key management solutions have the primary goal of managing and protecting encryption keys, and making them available to authorized applications in a secure fashion.


Key management solutions vary greatly in the complexity of the key retrieval process. The more complex the key retrieval interface, the greater the challenge for the Enterprise IT team in deploying key retrieval in applications. Understanding this fact can help IT decision makers assess different vendor solutions and the likely costs of deploying a solution in their enterprise.  Below is a list of questions that you should ask your key management vendor when assessing their solution.


Key Management Vendor Checklist

1.  Is your key manager FIPS 140 certified?  What is the certificate number?

2.  How would you describe the encryption key payload as retrieved from the key server?  Is it simple or complex?

3.  Is there a common key retrieval application interface on all platforms?  What are the differences?

4.  What platforms do you support for key retrieval?  (Note any gaps in platform coverage for your company)

5.  Do you provide working sample code for the platforms I need? (Windows, Linux, UNIX, IBM i, IBM z)

6.  Do you supply binary libraries for all Enterprise servers?

7.  Do you have a Java key retrieval class and examples? Is it standard Java or JNI?

8.  Do you charge separate license fees for each client operating system?

9.  Do you require that we purchase consulting services from you?  Why?

10.  As an independent software vendor (ISV), can we have you brand the solution and certify it for us?
 
Once you have the answer to the above questions, it should be easier to choose the right key management vendor for your Enterprise. If you have any questions, click here and we will call you right back.



Topics: Alliance Key Manager, NIST, Key Management, encryption key

Key Management Best Practices: What New PCI Regulations Say

Posted by Luke Probasco on Mar 24, 2011 1:25:00 PM
The new PCI Data Security Standards (PCI DSS v2.0) are here and we’ve gotten a lot of questions about the changes related to encryption key management. Because we work with a lot of companies going through PCI compliance audits and reviews, the new standards just confirm the trends we’ve seen over the last few months on how QSA auditors and security professionals view encryption key management, and what they see as the minimum requirements for managing keys. I recently sat down with Patrick Townsend, Founder & CTO of Townsend Security, to discuss the new PCI regulations in regards to encryption key management.  To hear an expanded podcast of our conversation, click here.

 

What is the source of industry best practices for key management?

 

The NIST special publication SP-800-57 provides specific pointers on best practices for both procedurally managing encryption keys, and what to look for in key management systems. In these documents you will find the genesis of most standards regarding encryption key management, including the concepts in PCI DSS 2.0 Section 3.

Also, key management solutions are certified to the FIPS-140-2 standard for cryptographic modules. So FIPS-140 is a source of best practices and standards.

 

Dual control, separation of duties, and split knowledge have been buzz topics in the key management world lately.  What do they mean?

 

Well, dual control means that at least two people should be required to authenticate before performing critical key management tasks.
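As a rough illustration of that requirement (all names here are hypothetical, and a real system would enforce this with authenticated credentials rather than plain strings), a critical key management task can be gated on approvals from two distinct administrators:

```python
class DualControlGate:
    """Allow a critical key management task only after `required` distinct
    administrators have each approved it. A sketch of the concept only."""
    def __init__(self, required: int = 2):
        self.required = required
        self.approvals = set()

    def approve(self, admin_id: str) -> None:
        self.approvals.add(admin_id)

    def authorized(self) -> bool:
        # A set ignores duplicates, so one person cannot approve twice.
        return len(self.approvals) >= self.required

gate = DualControlGate()
gate.approve("alice")
gate.approve("alice")          # the same person approving twice doesn't count
assert not gate.authorized()
gate.approve("bob")            # a second, distinct administrator
assert gate.authorized()
```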

 

Separation of duties means that the individuals managing encryption keys should not have access to protected data such as credit cards, and those that have access to protected data should not have the authority to manage encryption keys.

 

Split knowledge is defined in the PCI DSS version 2.0 glossary as a “condition in which two or more entities separately have key components that individually convey no knowledge of the resultant cryptographic key.”

 

Are there any standards or best practices regarding “integrated key management?”

 

“Integrated key management” is a term of art, and not a standard.  If “integrated key management” means “we store our encryption keys on the server where the data is,” then that is a bad thing, from a compliance and security point of view. 

 

So, what are the best practices for encryption key management?

 

First you should follow the key life-cycle and be able to document it.  You should always separate the keys from the data.  If you follow the PCI guidelines, you are in excellent shape.  Finally, I would recommend only using a FIPS 140 certified key management solution.

 

What are the practical implications of these best practices and core concepts such as “dual control” and “separation of duties?”

 

One of the practical implications follows from a common fact of system administration. On all major operating systems such as Linux, Windows, and IBM System i (AS/400) there is one individual who has the authority to manage all processes and files on the system. This is the Administrator on Windows, the root user on Linux and UNIX, and the security officer on the IBM i platform. In fact, there are usually multiple people who have this level of authority. In one study by PowerTech, the average IBM System i customer had 26 users with this high level of authority! You just can’t meet PCI and other industry standards for proper key management by storing the encryption keys on the same platform as the data you are trying to protect.

 

To download an expanded podcast of our conversation, click here.

 

Topics: Key Management, PCI DSS, Best Practices

Data Privacy in a De-Perimeterized World

Posted by Patrick Townsend on Feb 25, 2011 8:33:00 AM
I just listened to a discussion of database security hosted by Oracle that was very well done. At one point the discussion turned to current threats and how the Enterprise has lost the ability to use perimeter protection for sensitive data. This has been a topic of much discussion in the security area for the last few months. Perimeter protection is based on the use of Firewall and similar technologies to keep the bad guys out, but this approach is failing with the advance of more sophisticated attacks, the use of social media by large organizations, the advance of mobile technologies, insider threats, and the migration of applications to cloud platforms. The trend is called “de-perimeterization” and represents a bit of a challenge to organizations that need to protect sensitive data.

Vipin Samir and Nishant Kaushik did a great job of describing how the process of de-perimeterization has forced companies to fall back on user access controls to protect data. But user access controls are notoriously weak.  Weak passwords and sophisticated password cracking routines make it almost impossible to properly secure a database. So what is a security administrator to do?

Here are the suggestions from the panel that are a part of a defense-in-depth strategy:

Use Encryption to Protect Data:
Companies should use encryption at the database level or column level to protect data. This will secure data at rest on backup tapes and on disk in the event a drive is replaced. Encryption is an important part of the data protection strategy, but it needs to be combined with other techniques.

Use Good Key Management:
Protecting encryption keys is the most important part of the encryption strategy. Good key management techniques are needed, and the keys must be separated from the data they protect. Without this separation from protected data it is impossible to implement separation of duties and dual control – important parts of the key management strategy. See our Alliance Key Manager solution for more information about securing encryption keys.

Separation of Duties:
Because the threat from insiders is rising, it is important that the management of encryption keys be separate from the management of databases. Database administrators responsible for our relational databases should not have access to encryption key management, and security administrators should not manage databases. This is a core principle in data security regulations such as PCI DSS, but is often overlooked.

Context Sensitive Controls and Monitoring:
The last important step is to be sure that data access controls are sensitive to the data and its context. Bill in shipping has access to the order database, but should he really be decrypting the credit card number? Does your encryption solution detect and block this type of event? How will you monitor this security event? Or, Sally is authorized to view HR data from the accounting application, but should she really be using FTP to transfer this data? Normal encryption functions would not provide adequate protection from these types of data access. Context sensitive controls are needed to augment encryption.

When we started planning for automatic encryption in our Alliance AES/400 product two years ago, we took care to implement context sensitive controls right in the decryption APIs. That is now available in V7R1 of the IBM i operating system. We avoided the error of basing these controls on user account authorities and native OS security. Just because the operating system says you have read access rights to a database table, doesn’t mean you should be decrypting the social security number or using FTP to transfer the file. I’m happy with our implementation that is based on explicit authorization by a security administrator, and application white lists.
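The white-list idea can be sketched in a few lines. Our actual implementation lives in the Alliance AES/400 decryption APIs on the IBM i; the Python below, with hypothetical names, only shows the concept: a decrypt request succeeds only if the security administrator has explicitly authorized that application/field pair, regardless of OS-level read rights.

```python
# Application white list maintained by a security administrator.
# Holding OS read access to the table is NOT sufficient to decrypt.
AUTHORIZED_DECRYPT = {
    ("order_entry", "credit_card"),
    ("hr_portal",   "ssn"),
}

def may_decrypt(application: str, field: str) -> bool:
    """Context-sensitive control: decryption requires explicit authorization."""
    return (application, field) in AUTHORIZED_DECRYPT

assert may_decrypt("order_entry", "credit_card")     # Bill's order app: allowed
assert not may_decrypt("ftp_client", "credit_card")  # bulk transfer tool: blocked
```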

You can get more information and request an evaluation version of our Alliance AES/400 solution here.

You can find the Oracle presentation here. Look for “How secure is your Enterprise Application Data?”

Patrick

Topics: Key Management, De-Perimeterization, Oracle, Separation of Duties, Alliance AES/400, Encryption Key Management, Defense-in-Depth, automatic encryption, AES Encryption

Increased Key Management Awareness at RSA Conference 2011

Posted by Luke Probasco on Feb 16, 2011 9:33:00 AM

As day three of the RSA Conference 2011 begins, it marks the half-way point through the largest data security tradeshow that the industry has to offer. Walking into the show you would be hard pressed to tell whether you walked into a security show or a grown-up play-yard. Look to the left and you see sumo wrestlers; look ahead and there are unicycles weaving through the crowd; and to the right are pirates handing out candy. And to top it all off, each night ends with beer, wine, and appetizers for all attendees. Who wouldn’t want to attend the RSA Conference 2011?!

As you look past all the gimmicks, the technology is still really what matters.

A noticeable change over the past two years is the increased awareness of FIPS-140 certification for key managers.

We believe this is largely driven by compliance auditors whose demands have evolved from "you must encrypt" to "you can't store your keys with your data" to "you need to use a key manager" and are now converging on "you need a FIPS-140 certified key manager."

As the auditing community matures, we expect the requirements for formal government certifications to move from occasional to mandatory.

In the past we usually only heard these concerns from sophisticated security architects with very large companies. Now we are seeing this awareness beginning to move through the SMB marketplace.

Prospective partners, future clients and current customers recognize that Townsend Security has done encryption and key management the way that it needs to be done – and proven by NIST and FIPS certifications. If your encryption offering hasn’t been reviewed and certified by NIST, you have no assurance that you aren’t implementing a less than secure product. “I wouldn’t consider an encryption solution that isn’t certified by NIST” is a common statement by attendees at our booth.

Would you like to see firsthand how certified encryption and key management will work at your organization? Click on the links to request evaluation versions of AES encryption and Key Manager. One of our security specialists will be in contact with you to make sure you are up and running and to answer any questions that you might have.

Or, if you just would like to learn more about encryption and key management, visit the resources section of our web site.

And if you read this while you are still at the RSA Conference 2011, stop by our booth and pick up a little somethin’ special that we have been saving you.

Topics: Key Management, AES, RSA

The Uncomfortable Truth About Key Management on the IBM i

Posted by Patrick Townsend on Feb 15, 2011 10:21:00 AM

Last week we presented a webinar on PCI requirements for encryption key management.  Many of the people who attended were encrypting data on the System i and were curious about how to manage their encryption keys according to new PCI requirements.

The way organizations are managing encryption keys is falling under more scrutiny by QSA auditors.  Companies must demonstrate they are enforcing dual control and separation of duties.  

How is this achieved on the System i? Is it still effective to use an integrated key management solution that stores encryption keys in the same partition as the encrypted data?  The answer, quite simply, is "no."  PCI DSS states the following requirements for key management:

 

  • Dual Control means that at least two people should be required to authenticate before performing critical key management tasks.
  • Separation of Duties means that the individuals managing encryption keys should not have access to protected data such as credit cards, and those that have access to protected data should not have the authority to manage encryption keys.

Now here is an uncomfortable truth.  On the IBM i you simply can't achieve these PCI requirements if you store the encryption key in the same partition as the protected data.  The QSECOFR user profile (and any user profile with *ALLOBJ authority) will always have complete access to every asset on the system.  An *ALLOBJ user can circumvent controls by changing another user's password, replacing master keys and key encryption keys, changing and/or deleting system logs, managing validation lists, and directly accessing database files that contain encrypted data.

From the perspective of PCI, an integrated key management system puts too much control into the hands of any one individual.

The only way to comply with PCI requirements for key management is to store the encryption keys off of the IBM i.  Take users with *ALLOBJ out of the picture completely.  When you use a separate appliance to manage encryption keys you can grant a user access to the protected data on the System i and deny that same user access to the key manager.  Now you have enforced separation of duties.  And with the right key management appliance you can require TWO users to authenticate before keys can be managed, and have dual control of encryption keys.

Let us know about the key management strategy at your company. Is your organization encrypting data on the System i?  How are you managing the encryption keys? If you store them on a separate partition, have you had a recent PCI audit?  What did your auditor say?

Topics: Compliance, Encryption, Key Management, IBM i, encryption key

Encryption, Key Management and Tokenization - The Trifecta

Posted by Patrick Townsend on Feb 11, 2011 11:47:00 AM

Compliance regulations are moving inexorably towards requiring the protection of sensitive data. The private information of customers, employees, patients, vendors and all of the people we come into contact with as Enterprises, must be protected from loss and misuse. At the same time that regulations are getting more teeth, there is more consensus about the technologies to protect data in our applications and databases. Encryption and tokenization are now the accepted methods of protecting data, and encryption key management is central to both technologies.

How fast are regulations changing? Really fast. The Payment Card Industry Security Standards Council will update the PCI Data Security Standard (PCI DSS) this year, and will be on a three year cycle of updates. Merchants accepting credit cards will have about 18 months to implement the changes. State privacy laws are undergoing frequent changes, most of which make the rules more stringent. Minnesota, Nevada, and Washington State have made recent changes. The HITECH Act of 2009 and related guidance further tightens the rules around protecting patient data, and further guidance is expected this year. Last, but not least, the federal government is moving new legislation through Congress to enact a national privacy law.

These changes are coming fast, and they have one thing in common: data protection requirements are getting stronger, not weaker. Companies and organizations should be paying attention to their data protection strategies now in order to avoid expensive rip-and-tear operations in the future.

One other tendency of the evolving regulations is this: a clear reference to standards for data protection. All of the mentioned regulations now make reference to widely accepted standards, usually those of the National Institute of Standards and Technology (NIST), which publishes standards and testing protocols for encryption and key management. Over the last two years PCI (and related guidance from Visa), the HITECH Act, state privacy laws, and other regulations have specifically referenced NIST for data encryption and key management standards.

Companies and organizations acquiring data protection technologies should look carefully at how solutions match up to the standards. And a word of warning here: There is still a lot of snake oil in the data protection industry. Be sure that your data protection vendor can prove that their solutions actually meet the NIST standards. This is actually not hard to independently verify – NIST publishes on-line lists of vendors who certify their solutions to the standard.

Encryption is a well defined technology to protect data through the use of an encryption algorithm and secret key. When combined with proper key management, encryption provides a well accepted method of protecting sensitive data. There is a long history of work by professional cryptographers and NIST on defining how good encryption and key management should work, and you can easily determine which vendors meet the standard through the certification process.

Tokenization is a new technology and lacks the history and standards of encryption, but which incorporates encryption technologies. Tokenization works by substituting a surrogate value (or “token”) for the original data. By itself the token does not tell you anything about the original value which might be a credit card number, patient ID, and so forth. But tokenization requires that you use good encryption practices to protect the sensitive data in the token database. This means you have to use a tokenization solution that meets the same stringent standards for encryption and key management. When acquiring a tokenization solution, you will want to use the same diligence about encryption and key management that you would use for a pure encryption solution – that is, the solution should be built to standards and the vendor should be able to prove it through the NIST certification process.
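Conceptually, tokenization looks like the sketch below (Python, with hypothetical names; a real solution encrypts the vault itself under keys held in a certified key manager and adds access controls and auditing):

```python
import secrets

class TokenVault:
    """Minimal tokenization sketch: substitute a random surrogate for the
    original value and keep the mapping in a protected vault."""
    def __init__(self):
        self._vault = {}

    def tokenize(self, value: str) -> str:
        token = secrets.token_hex(8)   # random: conveys nothing about the value
        self._vault[token] = value     # in production: stored encrypted
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
assert vault.detokenize(token) == "4111111111111111"  # only the vault reverses it
```

Because the token by itself carries no information about the credit card number, systems that store only tokens hold nothing directly useful to a thief; the sensitive data is concentrated in the vault, which is exactly why the vault must meet the same encryption and key management standards.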

Remember, a tokenization solution will be IN SCOPE for a PCI audit!

Tokenization standards are still evolving. Bob Russo of the PCI Security Standards Council indicated that the council will be taking up work on this in 2010. Visa just released a best practices guide for tokenization (you can get it here), and you can probably expect the eventual standards to incorporate much of this guidance. Additionally, the X9 organization is also working on standards for tokenization.

In regards to tokenization standards, stay tuned! Much more is coming our way.

Encryption, tokenization, and key management – this is the trifecta for protecting data at rest. I’ll have more comments in the future about tokenization as we analyze the best practice guidance from Visa and help you connect the dots with our encryption, tokenization, and key management solutions.

Patrick

Topics: Encryption, NIST, Key Management, PCI DSS, tokenization

PCI DSS 2.0 and Encryption Key Management

Posted by Patrick Townsend on Feb 11, 2011 11:46:00 AM

2014 UPDATE:
No Significant Changes around Key Management in PCI DSS v3.0


The new PCI Data Security Standards (PCI DSS v2.0) are here and I’ve gotten a lot of questions about the changes related to encryption key management. Because we work with a lot of companies going through PCI compliance audits and reviews, the new standards just confirm the trends we’ve seen over the last few months on how QSA auditors and security professionals view encryption key management, and what they see as the minimum requirements for managing keys.  The clear trend is to require that encryption keys be stored separately from the data they protect, and to make sure that the people who manage encryption keys are not the people who manage the protected data. Let’s look at why this is happening.

While most of the largest merchants in the Level 1 category are already using professional key management solutions to protect encryption keys, the trend over the last 12 months is to require smaller merchants in the Level 2 and Level 3 categories to use better key management practices, too. So, what are the parts of PCI DSS that are driving this change?  It all has to do with industry best practices for encryption key management, and the concepts of Dual Control, Separation of Duties, and Split Knowledge. These best practices and concepts work together to form the basis for determining if your approach to key management will pass muster.

First, what is the source of industry best practices for key management? Here in the US, the National Institute of Standards and Technology (NIST) is the most common source for guidance on best practices. The NIST special publication SP-800-57 provides specific pointers on best practices, both for procedurally managing encryption keys and for what to look for in key management systems. In these documents you will find the genesis of most standards regarding encryption key management, including the concepts in PCI DSS 2.0 Section 3.

Next, it is important to understand Dual Control, Separation of Duties, and Split Knowledge. These are all clearly defined in the PCI DSS standard and in the accompanying PCI DSS glossary. I’ve extracted the exact definitions below, but I’ll recap them here from the point of view of key management.

Dual Control means that no one person should be able to manage your encryption keys. Creating, distributing, and defining access controls should require at least two individuals working together to accomplish the task.

Separation of Duties means that different people should control different aspects of your key management strategy. This is the old adage “don’t put all your eggs in one basket.” The person who creates and manages the keys should not have access to the data they protect. And the person with access to protected data should not be able to manage encryption keys.

Split Knowledge applies to the manual generation of encryption keys, or at any point where encryption keys are available in the clear. More than one person should be required to constitute or re-constitute a key in this situation.
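A standard way to meet the split-knowledge requirement is XOR key components: each custodian holds one random-looking component, and only the XOR of all components reconstitutes the key, so no single component conveys any knowledge of it. A minimal sketch (illustrative Python, not any particular product's interface):

```python
import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n: int = 2) -> list:
    """Split `key` into n components; all n are required to rebuild it."""
    components = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    components.append(reduce(xor_bytes, components, key))  # key ^ c1 ^ ... ^ c(n-1)
    return components

def join_key(components: list) -> bytes:
    return reduce(xor_bytes, components)

key = secrets.token_bytes(32)        # a 256-bit key
parts = split_key(key, n=3)          # three custodians, one component each
assert join_key(parts) == key        # all three together recover the key
assert all(p != key for p in parts)  # no single component reveals it
```

Each component is indistinguishable from random data on its own; even n-1 custodians together learn nothing about the key, which is the property the PCI glossary definition is describing.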

What are the practical implications of these best practices and core concepts?  One of the practical implications follows from a common fact of system administration. On all major operating systems such as Linux, Windows, and IBM System i (AS/400) there is one individual who has the authority to manage all processes and files on the system. This is the Administrator on Windows, the root user on Linux and UNIX, and the security officer on the IBM System i platform. In fact, there are usually multiple people who have this level of authority. In one study by PowerTech, the average IBM System i customer had 26 users with this level of authority!

That’s why storing encryption keys on the same system where the protected data resides violates all of the core principles of data protection, and that’s why we are seeing auditors and payment networks reject this approach. If you haven’t faced this issue yet, your day is probably coming. Now is the time to start planning on how to deal with the problem.

Over two years ago we saw this trend developing and took action to help merchants be prepared for proper key management. We created the Alliance Key Manager solution and released it to our partner channel in 2009. This year we released it for direct sale, and last week we received our FIPS-140-2 certification from NIST. Over 1,000 customers are now using AKM to protect their encryption keys with a solution that provably meets industry standards.  Our encryption products have been updated to use this new key management solution, and we are moving customers forward to compliance. It’s been a long, hard slog to NIST FIPS-140 certification, but I think our customers will benefit from the effort.

I hope this has been helpful in clarifying key management best practices. For more information on PCI and key management, download our podcast titled "Key Management Best Practices: What New PCI Regulations Say." Please let us know if you have any questions.


---

From the PCI DSS version 2.0 Glossary:

Dual control
“Process of using two or more separate entities (usually persons) operating in concert to protect sensitive functions or information. Both entities are equally responsible for the physical protection of materials involved in vulnerable transactions. No single person is permitted to access or use the materials (for example, the cryptographic key). For manual key generation, conveyance, loading, storage, and retrieval, dual control requires dividing knowledge of the key among the entities. (See also Split Knowledge).”


Separation of Duties
“Practice of dividing steps in a function among different individuals, so as to keep a single individual from being able to subvert the process.”

Split knowledge
“Condition in which two or more entities separately have key components that individually convey no knowledge of the resultant cryptographic key.”

Source documents are available online at www.pcisecuritystandards.org
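The split-knowledge definition above can be sketched in a few lines of code: each custodian holds a random share of the key, any single share conveys no information about the key, and only XOR-combining every share reconstructs it. This is a minimal illustrative sketch, not code from any Townsend product; the function names are my own.

```python
import secrets

def split_key(key: bytes, parts: int = 2) -> list:
    """Split a key into XOR shares; each share alone reveals nothing."""
    shares = [secrets.token_bytes(len(key)) for _ in range(parts - 1)]
    last = key
    for share in shares:
        last = bytes(a ^ b for a, b in zip(last, share))
    shares.append(last)
    return shares

def combine_key(shares: list) -> bytes:
    """XOR all shares together to reconstruct the original key."""
    key = bytes(len(shares[0]))  # all-zero starting value
    for share in shares:
        key = bytes(a ^ b for a, b in zip(key, share))
    return key

key = secrets.token_bytes(16)          # e.g. an AES-128 key
custodian_shares = split_key(key, 3)   # three key custodians
assert combine_key(custodian_shares) == key
```

Because the shares are generated with a cryptographically secure random source, this scheme also satisfies dual control: no single custodian can use or recover the key on their own.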

Topics: Compliance, Encryption, Key Management, PCI DSS, PCI, FIPS-140

Blackberry, Key Management, and Message Security

Posted by Patrick Townsend on Feb 11, 2011 11:44:00 AM

Many of us have been watching the ongoing drama between RIM (makers of the ubiquitous Blackberry) and various governments around the world. Governments have been successfully pressuring RIM to provide access to their internal messaging servers in order to get access to encrypted messages sent and received by Blackberry users. I think RIM has been trying to fight this access as best they can. After all, one of their key product messages is around the security of their systems. In spite of that, I suspect some governments have been successful in getting at least limited access to the Blackberry servers that process secure messages.

At first I was puzzled by this story when it started to emerge. I mistakenly thought that the private key needed to decrypt a message was stored on the receiver’s Blackberry and that the intermediate message servers would not have the key necessary to decrypt a message. I was apparently wrong about this architecture; it turns out that the Blackberry message servers do have the ability to decrypt messages in transit. That ability puts RIM squarely in the uncomfortable spotlight of law enforcement and security agencies around the world.

People have been asking me if a similar situation exists with other common encryption technologies. For example, when I encrypt a file with PGP, can it be decrypted by someone (A government? A credit card thief?) before it reaches the intended recipient? Before the drama with RIM I was not hearing this question, but now I think many people are wondering about it.

The short answer to the question is No: when you encrypt a file with PGP, it is not possible to decrypt it before it gets to the intended recipient. PGP is based on the widely used public/private key encryption technology deployed in many secure systems such as VPNs, web browsers, and secure FTP. When I encrypt some information with a public key, only the person holding the private key can decrypt the information. As long as I protect my private key, an intermediary can’t decrypt a message intended only for me. Almost all of our assumptions about security depend on this fact.
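The asymmetry described above — anyone can encrypt with the public key, but only the private-key holder can decrypt — can be illustrated with a toy RSA sketch. This uses tiny textbook primes purely for illustration; real systems like PGP use keys of 2048 bits or more plus secure padding, so do not use this for actual data protection.

```python
# Toy RSA sketch (NOT secure -- tiny primes, no padding; illustration only).
p, q = 61, 53
n = p * q                # public modulus
phi = (p - 1) * (q - 1)
e = 17                   # public exponent
d = pow(e, -1, phi)      # private exponent (modular inverse, Python 3.8+)

def encrypt(m: int, pub: tuple) -> int:
    """Anyone with the public key (e, n) can encrypt."""
    e, n = pub
    return pow(m, e, n)

def decrypt(c: int, priv: tuple) -> int:
    """Only the holder of the private key (d, n) can decrypt."""
    d, n = priv
    return pow(c, d, n)

message = 42
ciphertext = encrypt(message, (e, n))
assert ciphertext != message                      # unreadable in transit
assert decrypt(ciphertext, (d, n)) == message     # recipient recovers it
```

An intermediary who sees only the ciphertext and the public key cannot feasibly recover the message; with real key sizes, that is the hard problem the whole system rests on.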

Is this system perfect? No. As a recipient of secure messages I may inadvertently disclose my private key or lose it by failing to protect it properly. Also, I may be legally compelled by a government agency to relinquish it. Many governments are now requiring people to disclose their private keys and passwords when ordered by a court to do so. You might think that you can’t be compelled to give up a password or private key, but I think that resolve might fade after a few days of sitting in a jail cell. The bottom line is this: public/private key technology is the best method we have of protecting sensitive information. When done well it prevents anyone but an intended recipient from reading the sensitive information. But it also means that you have to pay attention to how you manage and protect encryption keys. Proper encryption key management is essential to any data protection method you use. We’ll be talking more about this in the days ahead.

Patrick

Topics: security, Key Management, public/private key, Blackberry/RIM, PGP
