Townsend Security Data Privacy Blog

Patrick Townsend

Recent Posts

Microsoft WPC 2011: SQL Server Encryption and the Cloud

Posted by Patrick Townsend on Jul 14, 2011 8:00:00 AM

Microsoft’s Worldwide Partner Conference just wrapped up and it was truly an international conference. There were partners from every corner of the world. Microsoft has invested a lot in this conference and they are doing a great job of helping companies meet new partners through the on-line WPC Connections web site.

It is clear to me that Microsoft is converging a wide range of products onto the SQL Server platform for data management. The many business applications under the Microsoft Dynamics label including Dynamics NAV (the ERP system), Dynamics CRM (customer relationship management), and Dynamics AX (global ERP management) are all based on the latest version of SQL Server.  The very popular SharePoint collaboration tool now fully supports and exposes SQL Server Enterprise edition. All of the Business Intelligence solutions have been based on SQL Server for some time. And this pattern repeats through other products.

Why is this important? In SQL Server 2008 Microsoft introduced a new database security architecture with Extensible Key Management (EKM). EKM enables database encryption and the use of Hardware Security Modules (HSM) to store and manage encryption keys. Encryption and good key management are crucial to regulatory compliance, and the EKM architecture makes this all possible. The EKM architecture extends forward to the new version of SQL Server code named Denali.

You will be hearing more from Townsend Security about Microsoft SQL Server encryption key management next month.

The Cloud is the other big topic at this conference. Microsoft is moving almost everything to the Cloud at full speed. Microsoft Dynamics, SharePoint, SQL Server, and many other products are getting Cloud-based versions. Microsoft may be a bit late to the game on the Cloud, but they are “all in” now. And they’ve made a lot of room for partners to play in this arena, too. There are a really large number of new and existing Cloud providers at WPC.

Of course, the biggest concern on the part of end customers is the security of the Cloud. After many discussions with Microsoft partners, I know that they have heard this concern. But there is still quite a bit of confusion and ignorance about how to mitigate risk in Cloud environments. I can see we have our work cut out for us in helping to educate the Microsoft partner community about how they can use our solutions to encrypt and protect customer data. It won’t be as hard, painful, and expensive as they think.

Be sure to follow us on Facebook, Twitter, and LinkedIn to hear more about what we are doing with Key Management for SQL Server 2008 and stay up to date on the latest trends in data protection.

Patrick

 


Topics: Microsoft, SQL Server, Worldwide Partner Conference

Tokenization & Encryption: The Data Protection Package

Posted by Patrick Townsend on Jul 12, 2011 8:00:00 AM
As I talk to Enterprise customers I’m finding a lot of confusion about when to use encryption or tokenization, and how to think about these two data protection technologies. Once you understand how each of these technologies works, you see that there are no easy answers about which is best for you, or when one is better than the other. I want to talk about some general guidelines I’ve developed to help with this conundrum.

Encryption is well known technology, has clear standards, and has been in use for data protection for a long time. Most of the compliance regulations (PCI, HIPAA/HITECH, state privacy regulations, etc.) make clear reference to encryption and widely accepted standards. So it is a no-brainer to use encryption. When done correctly, it is going to meet compliance regulations and your security goals.

But encryption has a nasty side-effect. When you encrypt fields in your database that are indexes or keys, you disrupt the indexing capability of the field and you introduce unacceptable performance burdens on your system. Encrypting an index or key field often means re-engineering the application and this is costly and time consuming.
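
To make the indexing problem concrete, here is a toy sketch in Python. The cipher below is deliberately fake (not real cryptography); it exists only to demonstrate the property that causes the trouble: secure encryption is randomized, so encrypting the same card number twice yields different ciphertexts, and an equality index built on the encrypted column stops matching.

```python
import hashlib
import os

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Illustrative randomized 'encryption' (NOT real crypto). A fresh
    random nonce makes every ciphertext unique -- exactly the property
    that breaks equality indexes on an encrypted database column."""
    nonce = os.urandom(16)
    keystream = hashlib.sha256(key + nonce).digest()
    body = bytes(p ^ k for p, k in zip(plaintext.ljust(32, b"\0"), keystream))
    return nonce + body

key = b"demo-key"
ct1 = toy_encrypt(key, b"4111111111111111")
ct2 = toy_encrypt(key, b"4111111111111111")

# Same card number, two different ciphertexts -- an index built on the
# encrypted column can no longer find matching rows.
print(ct1 != ct2)  # True (with overwhelming probability)
```

This is why encrypting an index or key field typically forces re-engineering: the database can no longer use the encrypted value for lookups or joins.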

Enter the new kid on the block – Tokenization.

When you tokenize data you replace the sensitive data with a surrogate, or token, value. The token itself is not sensitive data, but it maintains the characteristics of the original sensitive data. It walks like a duck, it quacks like a duck, but from a compliance point of view, it is NOT a duck. Tokenizing data lets you maintain those precious index and key relationships in your databases, and minimizes the number of changes you have to do to your applications.
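
A minimal sketch of what a token vault might look like (an assumed design for illustration, not any particular product's implementation; real solutions add collision handling, encryption of the vault, and access controls):

```python
import secrets

class TokenVault:
    """Toy tokenization vault: sensitive values are swapped for random
    tokens that preserve length and the last four digits, so existing
    indexes and display logic keep working."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, pan: str) -> str:
        if pan in self._value_to_token:           # same value -> same token,
            return self._value_to_token[pan]      # preserving joins/indexes
        digits = "".join(str(secrets.randbelow(10)) for _ in range(len(pan) - 4))
        token = digits + pan[-4:]                 # keep last four for display
        self._token_to_value[token] = pan
        self._value_to_token[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._token_to_value[token]        # the sensitive lookup

vault = TokenVault()
t = vault.tokenize("4111111111111111")
print(len(t) == 16, t.endswith("1111"))          # format preserved
print(vault.tokenize("4111111111111111") == t)   # deterministic, so joins work
```

Note the design choice: because the same input always maps to the same token, database relationships survive, which is precisely what randomized encryption breaks.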

So, why not use tokenization for everything? Is this the magic bullet we’ve been searching for?

Hold on there, Cowboy. There are some things you should think about.

Tokenization solutions typically work by creating a separate database to store the token and the relationship to the original sensitive data. This means that every time you need to register a new token, retrieve the attributes of the token, or recover the sensitive data, you have to make a request to the tokenization solution to do this work for you. Got 10 million records in your database? This is going to have a major impact on performance. Applications that need high performance may not be the best for a tokenization approach – you might really want to use encryption in this environment.

Then there is the question of compliance. Tokenization is new technology. At this point there are no standards for tokenization solutions, and no reference to tokenization in the published regulations. So, are you really compliant if you tokenize sensitive data? I think so, but you should be aware that this is an unsettled question.

When you tokenize data, you are creating a separate repository of information about the original sensitive data. In most cases you will probably be using a solution from a vendor. Since the tokenization solution contains sensitive data, it will itself be in scope for compliance. Has the vendor used encryption, key management, and secure communications that meet compliance regulations? How do you know? If you are going to deploy a tokenization solution you will want to see NIST certification of the solution’s encryption and key management so that you are not just relying on the claims of the vendor.

Most Enterprise customers will probably find uses for both encryption and tokenization. Encryption is great for those high-performance production applications. Tokenization is great for maintaining database relationships, and reducing risks in the development, test, QA and business intelligence databases. Both can help you protect your company’s sensitive data!

For more information on tokenization, view our recorded webcast titled "Tokenization and Compliance - Five Ways to Reduce Costs and Increase Security."



Topics: Compliance, Encryption, tokenization

Merchants and Smart Phone Payments – PCI Shields Up

Posted by Patrick Townsend on Jun 27, 2011 8:33:00 AM

Smart phone payment systems have exploded over the last few months, offering the prospect of turning every street vendor into a walking, talking, credit-card-accepting, free-spirited merchant. In some cases smart phone payment vendors are giving away free card readers just for signing up. Some people saw this as the welcome exuberance of democratic capitalism, with innovation driving new opportunities. Others saw it as the apocalypse for credit card security. Is there some middle ground here?

Last November the PCI Security Council took the unprecedented step of suspending all Payment Application Data Security Standard (PA-DSS) certifications of smart phone payment applications, and refused to accept new applications based on that platform. That sent a signal to all established merchants that the council had serious concerns and would be issuing new guidance. Established vendors of payment solutions were left in limbo, and new startups who wanted into this field found themselves stalled. Nerve wracking to say the least.

Today the PCI Security Council removed some of the uncertainty about smart phone payment systems by issuing some preliminary guidance.  You can read the press release here.

In the press release you can find initial guidance on what smart phone applications might qualify for PA-DSS certification, and which applications will likely be excluded from the process.  You can find that guidance here.

This guidance is bleak for almost all of the smart phone applications currently in the market.  In regards to applications that will NOT be considered for certification, this item stands out:

 

13. Does the application operate on any consumer electronic handheld device (e.g., smart phone, tablet or PDA) that is not solely dedicated to payment acceptance for transaction processing?

 

Does this mean that a merchant can’t use a smart phone application for payment processing? Nope, the council addresses that, too. If you want to accept payments using smart phone technology, you must include the smart phone application as a part of your normal PCI DSS review process. So, as a merchant, that is your path to PCI compliance. But I don’t think it is time to pop the cork quite yet.

A normal PCI DSS review looks at all of the systems that process credit card information, and the systems they connect to. A payment application that has connectivity to another system generally puts that system into scope for PCI compliance. Your average smart phone connects to hundreds of millions (billions?) of servers on the Internet. That’s a scope of compliance from your worst nightmare.  So I don’t think we will see a rush by merchants to include smart phones in their PCI DSS plans.

Where is all of this going?

I think some smart phone vendors will move towards dedicated devices for payments. Some vendors of WiFi payment systems may incorporate solutions based on the less expensive smart phone platform. We might also see the emergence of alternative payment technologies that don’t directly involve credit card swipes (think something like PayPal?). Things look bad for those smart phone payment applications that got into the market early. If you are the Balloon Man working the local flea markets, don’t throw away that donation hat just yet.

For more data privacy news and tips, follow us on Facebook, Twitter, and LinkedIn.

    

Topics: PCI DSS, Smart Phones

FIPS-140 Certified Encryption and the "Aha" Moment

Posted by Patrick Townsend on Jun 16, 2011 8:28:00 AM

I believe that every individual or company that attempts to bring encryption products to market experiences an “Aha” moment. This is the moment when you realize how very difficult it is to get encryption right, and how many ways there are to get it wrong.

It’s not just that encryption is complicated (it is; to a non-mathematician the algorithms can be mind-boggling). It’s that there are so many aspects to doing an encryption implementation correctly that the likelihood of errors is high even for the best-intentioned and most knowledgeable developers. This “Aha” moment can be dramatic. It happens when you see all of your limitations clearly and you know that you are facing a crucial challenge.

However, what a person or company does after this “aha” moment says everything about their character and the quality of the products they bring to market.

When I had this “Aha” moment years ago, I realized that our company had to radically change how we approached the development of our encryption and key management products. I knew that we had to step up to much higher standards, and change how we looked at our own products. But where does one go to figure out how to do encryption right? Fortunately, our company had several good enterprise customers who helped point the way. Enterprise security architects directed us to the National Institute of Standards and Technology (NIST) web site and the FIPS-140 certification process. The NIST standards and the FIPS-140 certification process outline the proper standards and best practices for encryption, decryption, key management, and logging. So began the complete transformation in how we bring Townsend Security encryption products to market.

It wasn’t long, however, before the “Aha” moment was followed by an “Oh no” moment.

It quickly became clear that there was a large body of published guidance on the standards and best practices for encryption and key management. This stuff would fill a small library. And it was intense reading. This was not “Dick and Jane” beginning level stuff. It assumed that you started at a pretty advanced point in your knowledge of encryption and cryptographic module implementation. Not only are there published standards, but there are well-defined test and certification protocols.  And these tests were not going to be easy to pass. These tests are only conducted by a small number of certified labs (See NVLAP), the tests are detailed, complex, and designed to detect even the most minute errors that could cause encryption algorithms to fail.  Certification also means that you must undergo a stringent review of the encryption source code and your development practices.

This was the “Oh no” moment. This process was going to hurt.  It was going to be expensive, time consuming, and mentally taxing. And (at least initially) it was going to slow down our release schedule and increase our time to market.  There was also the concern that some competitors would rush to market faster with whiz-bang features that impressed customers in the demonstration process, but were of less importance to encrypting data.

This was going to be a huge undertaking.  I huddled with our development team. I huddled with our sales and marketing team. I took a long walk.

It was clear to me that this was a decision that would define who Townsend Security would be as a company, and it would illuminate how we really feel about taking care of our customers.  Were we really committed to doing security right and providing complete solutions to our customers?  Or were we willing to scrape along the bottom with inferior products that could be sold to less sophisticated customers?

Well, you already know how this came out. In the end I could come to no other conclusion.  We would either do the right thing, or get out of the security market altogether. We’re still in, so you know that we made that commitment and investment in NIST certification of our correctly implemented encryption solutions. We did learn a lot about encryption development processes and best practices. And I must say our products are so much better for it.

As you know, we made a substantial investment in the certification effort (we still do), and we do have some competitors, especially in the IBM System i (AS/400) marketplace, who claim to have jumped ahead of us. But I know how dramatically our certification efforts have improved our products, and I know how much better off our customers are because of it. Customers who have a NIST-certified solution will be protected as regulations grow harsher; those who put their trust in non-certified solutions will find themselves at the mercy of ever-evolving regulatory standards. As these compliance regulations evolve and incorporate standards and independent assessment in their guidelines, our customers will benefit from our efforts. And as the attacks on our protected data get ever more sophisticated, we will see poorly crafted encryption and key management products easily broken, with heartbreaking losses for the companies involved.

So, I am a converted believer in the independent certification process.  No one believes that independent NIST certification is a guarantee of perfect security. But no one who has been paying attention believes anymore that a security product should be trusted without it.  We believe the encryption and key management you trust to protect your entire Enterprise database should be equally (if not more stringently) proven and validated.  Click here to download a free 30-day evaluation of our NIST-certified AES encryption - available for all Enterprise platforms (Windows, UNIX, Linux, IBM i, IBM z).

Patrick


Topics: Encryption, NIST, FIPS-140

Five Ways to Help Your Company Prevent a Phishing Attack

Posted by Patrick Townsend on Jun 13, 2011 1:59:00 PM

As you probably know, “Phishing” is the security term used for email that looks perfectly valid, but which contains links or attachments that can infect your PC. Really good phishing email looks like it came from someone you know, or from a business that you work with and trust. A well-crafted Phishing scheme lowers your defenses. You say to yourself, “I’m glad John got back to me on that financial plan.” Or, “I wonder why Wal-Mart is having trouble with my invoice.” And a click or two later you’ve fallen victim to a phishing attack.

Sometimes you know right away when you’ve fallen victim. Your PC goes bonkers or acts oddly and perhaps disturbing messages appear. However, the worst infections can go undetected for a long time. The malware may be snooping for your on-line banking account password, or trying to steal other valuable information. These are probably the worst types of malware infections as you don’t know you are infected.

Small and mid-sized businesses are now under increasing attack from this type of security threat. Organized criminals are looking at these companies as more vulnerable and easier targets. They may have smaller bank accounts, but it may be easier to drain them. So don’t think being a small company will not make you a target.

Here are some thoughts on simple things you can do:

  • Be sure all of your PCs and Macs are running the latest anti-virus protection software. Nothing should be connected to your network that does not have the best possible protection.
  • Be sure you use strong and unique passwords for financial accounts. We human animals like to minimize the number of complicated things we have to remember. If you use the same password for Facebook and your company bank account, you are in a lot of danger.
  • If you are a small company, consider dedicating a small laptop to do your on-line banking. You could load Linux (Ubuntu is my favorite) and a web browser like Firefox, and only use the laptop for that one function.
  • Use two-factor authentication for all of your high value transactions. The better banks will help you implement this, and it is one thing that can be helpful.
  • Be sure to remind your colleagues on a regular basis to be careful. Being alert is one of the strongest deterrents.
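
On the second tip, strong unique passwords are easy to automate. Here is a small sketch using Python’s standard `secrets` module (the character set, length, and account names are my own choices, purely for illustration):

```python
import secrets
import string

def strong_password(length: int = 16) -> str:
    """Generate a random password from a mixed character set using the
    cryptographically secure `secrets` module (not `random`)."""
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*"
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One unique password per account, so a Facebook breach can't
# unlock the company bank account.
passwords = {site: strong_password() for site in ("bank", "facebook", "email")}
print(all(len(p) == 16 for p in passwords.values()))  # True
```

A password manager accomplishes the same thing without the do-it-yourself step; the point is simply that no two accounts share a credential.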

One of the biggest mistakes you can make is to feel you are immune from this type of attack. Those of us who work in IT or in the security area begin to think we are bullet-proof. Not so! I found myself shocked recently after clicking on a Facebook posting that looked like it came from my daughter, and watching Microsoft Security Essentials quarantine a nasty virus. My shields were down and I suffered an attack. But this is the characteristic of a really good phishing attack. You relax into a state of trust right at the wrong time.

Now, where’s that email from my new business partner in Nigeria?

For more data privacy news and tips, follow us on Facebook, Twitter, and LinkedIn.


Topics: Data Privacy, Best Practices

Encrypted USB Drives Hacked: What Went Wrong?

Posted by Patrick Townsend on Jun 7, 2011 8:30:00 AM

I’ve always liked those Holiday Inn Express commercials with the theme of “Stay Smart.” The commercials portray an “expert” stepping in to save the day. In one, a “nuclear expert” takes charge of a reactor about to melt down. In another a “doctor” arrives to deliver a baby just in the nick of time. The tag line is funny because the so-called expert turns out not to be a nuclear scientist or a doctor, but just an average person who stayed at a Holiday Inn Express. That made them “smart.” Don’t worry; I’ll bring this discussion back to encryption momentarily.

A while back, reports surfaced of broken encryption security for some Kingston, Verbatim, and SanDisk secure USB storage devices. Not all of the vendor’s devices were affected, but some of their most popular products were. All these products were NIST-certified, causing some industry commentators to erroneously question the certification process. Being a big believer in independent certification, I’d like to weigh in on this controversy and set the record straight.

As it turns out, the weakness in these devices was not in the actual AES encryption, but in the key management processes. All the affected vendors quickly released replacements or patches to fix the problem, which is the right thing to do. But it was fascinating to watch some of the responses to this problem. Many commentators complained that the FIPS-140 testing was faulty, or that FIPS-140 testing was irrelevant. The implication is that FIPS-140 does not really give you any assurance of security, and therefore, also by implication, that it is not important.

This is really the wrong conclusion. Let me talk a little about FIPS-140 certification and what it does mean.

First, FIPS-140 certification is not a guarantee of security. It is an assurance that encryption and related security algorithms have been implemented in compliance with published standards, that an application uses good practices in exposing its operational interfaces, that start-up tests validate that the application has not been modified or corrupted, that cryptographic material is not exposed in application logs or leaked to memory, and that an independent expert has reviewed the source code. Going through a FIPS-140 certification is a grueling process for an encryption vendor and almost always results in finding some issues that need to be addressed to make the product more secure. Companies that engage in FIPS-140 certifications produce better products, and become better security designers in the process.

Is the FIPS-140 testing and certification process perfect? Of course not. That’s not a standard anyone can meet. In fact, NIST is working on a project right now to enhance the process. Believe me, the new certification process (probably to be named FIPS-140-3, for version 3) will not be weaker than the current process, it will be better.

The lesson from the encrypted USB problem is not that FIPS-140 certification is meaningless. It’s that doing encryption right is really difficult. If you want a secure USB storage device, you would NEVER consider using a product that was not FIPS-140 certified. We have plenty of experience of broken security on non-certified products. Problems with certified products are rare, but do happen. Usually you will find that a problem with a FIPS-140 certified product is with some aspect of the application that was out of scope for the certification. That’s the case for the encrypted USB devices that had problems.

To bring us back full circle, I just want to say that no responsible Enterprise should trust a non-certified USB device, any more than you or I should trust a “doctor” to perform surgery because that “doctor” stayed at a Holiday Inn Express. The sad fact is that many large corporations today are putting their trust in encryption vendors who have not FIPS-140 certified their products. The management of these companies would never consider using a $100 secure USB device without certification, but do entrust the protection of huge amounts of sensitive data to non-certified vendors. In this age where too many try to pass themselves off as experts, it often takes an organization like NIST to certify the expertise behind something as important as encryption.

I’m proud of our NIST certifications – we will never back down from our commitment to provide you with the best security products and our commitment to independent certification. You can learn more about FIPS-140 certification on our web site, or directly from NIST at www.NIST.gov.

For more information, download our white paper "AES Encryption and Related Concepts" and learn about how proper encryption and encryption key management work together to secure your data.


Topics: NIST, Encryption Key Management, AES

Epsilon Data Breach - More Serious Than You Think

Posted by Patrick Townsend on May 17, 2011 12:00:00 AM

I found the data breach of Epsilon just shocking for several reasons:

First, the scope of the breach was astounding. About 2,500 companies are using Epsilon for email communications with their customers, and some of these companies are quite large. Thus the number of email addresses exposed was gigantic. You really have to wonder why those email addresses weren’t encrypted. Anyone would see those email addresses as a high value target. And email addresses are Personally Identifiable Information (PII), after all.

Second, you have to wonder why really large companies trusted Epsilon with their customer information without insisting on good data protection practices.  What were they thinking? When you hand over your data to an outside company, you aren’t off the hook if there is a data loss.  It wasn’t Epsilon who had to send emails and letters to customers. The originating companies bear the cost of that effort, and the business damage that follows.

Third, the loss of an email address is not trivial. It’s true that email addresses are more public than many other bits of personal information we have. But email addresses are often used as account identifiers for on-line services. If I have your account ID, it is a lot easier to attack your password credential. People are amazingly lax about creating strong passwords. So the loss of emails provides one more weak link in the chain of security for individuals.

Then there are the phishing attacks. If I have your email address it is a lot easier to send you an infected PDF file. I just look on your company’s web site or Facebook page and find the name of your CEO. Then I send you an email with the CEO’s name and an infected PDF. Perhaps I name the PDF “Look at these terrible results!.pdf”. You are probably going to jump to open that one!  So now I have invaded your internal network.

You can see how this can really escalate to bad news for you and your organization.

The lesson for any organization is to do some due diligence with your service providers. Be sure they are protecting your information with the same level of care that you do. After all, you are on the hook if they lose your data. For more information, download our white paper titled AES Encryption and Related Concepts.

 


Topics: Encryption, Phishing, Data Breach, Personally Identifiable Information (PII)

Security in the Cloud

Posted by Patrick Townsend on May 5, 2011 9:37:00 AM
We've been tracking the growing need for encryption and key management to secure the mass of data that is (or soon will be) residing in the Cloud. To address this issue, a security group was recently formed that is completely focused on Cloud security. If you’ve not visited the Cloud Security Alliance web site, it is well worth a visit at www.cloudsecurityalliance.org.

The alliance has attracted top tier talent in the security and audit communities, and has published guidance on issues that should concern anyone considering deploying Cloud solutions.

The guide covers three basic models of cloud deployment – IaaS (Infrastructure as a Service), PaaS (Platform as a Service), and SaaS (Software as a Service). It goes on to discuss how approaches to security must differ in the Cloud. It’s a nicely done, high-level guide to security in the cloud.

Section 11 in the guide is on encryption and key management, which is the focus of our company and products. Their recommendations on encryption are spot-on. Because of co-tenancy and shared resource management on cloud platforms, security professionals recognize that there is an elevated risk of loss. Cloud users need to take extra steps to protect sensitive information: encrypt data in motion, even between different applications and environments on the same cloud; encrypt data at rest and in archival storage; and encrypt data on backup media, ensuring that you have access to the encryption keys in a non-cloud environment.

The recommendations on key management are also very interesting. The alliance has recognized that weak key management is much more of a problem in Cloud environments. Here is a sample and summary of some of their recommendations (you can get the full report at their web site):

  • Key stores must themselves be protected in storage, transit, and backup. Encryption keys should never be stored in the clear, and keys should never be stored on the platform where they are used.
  • Access to keys should be controlled, and the users of encryption keys should not be the ones storing and managing the keys. This means you should never use native operating system account management as the access control mechanism for key management.
  • Secure backup and recovery of key management systems is even more important in Cloud environments. There are special requirements for backing up key management systems.
  • Segregate key management from the cloud provider to avoid conflicts in the event of legal disclosure requirements. This will be a real challenge for companies that use Clouds for substantially all of their operations.
  • Ensure that encryption adheres to industry and government standards. Of course, the only way to ensure adherence to standards is to insist on NIST certification of encryption and key management solutions. For example, FIPS-140 certification should be a requirement for a key management solution.
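
The first two recommendations can be sketched in a few lines of Python. The `KeyService` class and its method names here are hypothetical, purely to illustrate the pattern of keeping keys off the platform that holds the data and controlling key access independently of operating system accounts:

```python
import os

class KeyService:
    """Stand-in for an external key manager (hypothetical API, for
    illustration only): keys live in this service, never on the
    platform that stores the encrypted data."""

    def __init__(self, authorized):
        self._keys = {}
        self._authorized = set(authorized)

    def create_key(self, key_id):
        self._keys[key_id] = os.urandom(32)   # 256-bit key, held off-platform

    def get_key(self, key_id, caller):
        # Access control belongs to the key service, not to the OS
        # accounts of the machine that uses the key.
        if caller not in self._authorized:
            raise PermissionError(f"{caller} may not use {key_id}")
        return self._keys[key_id]

kms = KeyService(authorized={"payments-app"})
kms.create_key("customer-db-key")

# The application's database stores only a key ID, never the key itself.
record = {"ciphertext": b"...", "key_id": "customer-db-key"}
key = kms.get_key(record["key_id"], caller="payments-app")
print(len(key))  # 32
```

Production key managers add mutual TLS, auditing, key rotation, and hardware protection, but the separation of duties shown here is the core idea.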

These are just some of the recommendations in this important guidance. If you are considering the Cloud as a home for your applications and systems, this guide is definitely for you.

For further information, we have produced a podcast titled Key Management Best Practices: What New PCI Regulations Say.


Patrick

Topics: security, cloud

SHARE Mainframe Conference 2011 and PGP Encryption

Posted by Patrick Townsend on Mar 9, 2011 7:53:00 AM
It was a great time of year to be in Anaheim, California last week for the IBM System z Mainframe SHARE user conference. The rains had just passed through and the weather was balmy. The Anaheim convention center is right next door to Disneyland, a place that was paradise to me growing up in Southern California.  The juxtaposition was not lost on anyone – Mainframes being the really serious computing platform, and Disneyland being the silliest and most fun place on planet Earth. But there was fun at the SHARE conference, too.

The death of the Mainframe has been predicted for years, but it keeps chugging along as one of the workhorses of large organizations. IBM has invested a lot in the hardware technology to keep it up to date, and you get a lot of bang for the buck with one of these systems. You can now even run Linux under z/VM and there are some really big installations of Linux on this platform.  All in all, it’s an impressive system.

I was at SHARE to support our partner, Software Diversified Services as they are now our distributor for PGP on the Mainframe z/OS platform. They are doing a great job of bringing this important encryption technology to IBM’s largest server system. People are often amazed at what you can do with PGP on the Mainframe. Create an Apple Mac self-decrypting archive on z/OS??? You have to be kidding, right? Nope, the PGP solution on the Mainframe creates self-decrypting archives for Windows, Mac, Linux, and flavors of UNIX. Also, it integrates with PGP Universal key server for key management. Another feature is that it compresses data up to 98 percent for encrypted data files. Additionally, it supports Mainframe file systems like PDS, Sequential, and VSAM. So PGP is an impressive offering for Mainframe customers who need to encrypt data for compliance. It was great to talk to the Mainframe customers who were approaching PGP with some trepidation. They were a lot more comfortable knowing that they could run PGP using normal JCL scripts.

With a customer base holding steady at 6,000 to 7,000 worldwide, and with IBM continuing to improve the platform and make it more affordable, I believe the Mainframe will be an important computing platform for years to come. We'll be seeing a lot more of both Mainframes and Mickey Mouse.

Click here for a free evaluation version of PGP for the Mainframe.

Patrick

Topics: SHARE, Mainframe, PGP

Migrating to Alliance Key Manager with IBM i Native Encryption APIs

Posted by Patrick Townsend on Mar 7, 2011 11:10:00 AM
Key ManagementNow that the new version of the PCI Data Security Standard (PCI DSS version 2.0) is in effect, many IBM i (AS/400, iSeries) customers are getting dinged on their PCI compliance in the area of encryption key management. The renewed focus on "Dual Control" and "Separation of Duties" by QSA auditors is forcing many IBM i customers to move from homegrown key management to a better method of securing keys. This is even happening for IBM i customers who use IBM’s Master Key and key database facility. Why is this? There is just no way to properly implement effective security controls for the QSECOFR user, or for any user with All Object (*ALLOBJ) authority. Thus no "Dual Control" and no "Separation of Duties." And QSA auditors have figured this out.

Moving to good key management does not mean you have to completely change how you encrypt the data. And it doesn’t have to be a time consuming, laborious process. Many IBM i customers use the native IBM i encryption APIs to protect data. Let us show you how easy it is to implement our Alliance Key Manager solution in RPG code while maintaining your encryption approach.

When you use the native IBM i APIs, you first create an encryption algorithm context, then a key context, and then you pass these contexts on the call to the encryption or decryption API. If you are using the IBM Master Key facility and special key database, you pass additional parameters to the key context API. Before migrating to our Alliance Key Manager solution, your RPG code might look something like this:

      * Create a key context
     C                   eval      myKey = 'some binary value'
     C                   eval      keySize = 32
     C                   eval      keyFormat = '0'
     C                   eval      keyType = 22
     C                   eval      keyForm = '0'
     C                   callp     CrtKeyCtx( myKey      :keySize :keyFormat
     C                                       :keyType    :keyForm :*OMIT
     C                                       :*OMIT      :KEYctx  :QUSEC)
      *
      * Now we call Qc3EncryptData or QC3ENCDT to encrypt some data
      * and pass it the key context field <KEYctx>

After you implement the Alliance Key Manager solution and its key retrieval API for the IBM i, your application code would look like this:

      * Get the key from Alliance Key Manager
     C                   eval      AKMName = 'SomeKeyName'
     C                   eval      AKMInstance = ' '
     C                   eval      AKMSize = 256
     C                   eval      AKMFormat = 1
     C                   callp     GetKey( AKMName       :AKMInstance
     C                                       :AKMSize    :AKMFormat
     C                                       :AKMKey     :AKMUsed
     C                                       :Expires    :LastChange
     C                                       :Reply)
      *
      * Now we can use the field <AKMKey> on the create of the key context
      *
      * Create a key context
     C                   eval      keySize = 32
     C                   eval      keyFormat = '0'
     C                   eval      keyType = 22
     C                   eval      keyForm = '0'
     C                   callp     CrtKeyCtx( AKMKey     :keySize :keyFormat
     C                                       :keyType    :keyForm :*OMIT
     C                                       :*OMIT      :KEYctx  :QUSEC)
      *
      * Now we call Qc3EncryptData or QC3ENCDT to encrypt some data
      * and pass it the key context field <KEYctx>. That code is unchanged.

Notice that you’ve added a few lines of code to retrieve the key from the key server, and then used the retrieved key to create the key context. For most IBM i customers this will be a very quick change involving just a few lines of code. If you’ve taken a common module approach to isolate the encryption code, this might mean changing just one or two applications on your system. If you are using the IBM i Master Key and key database facility, you will have one more step to re-encrypt the data using keys from the Alliance Key Manager server.
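For completeness, here is a sketch of what the unchanged encryption call itself might look like. The format names DATA0100, ALGD0100, and KEYD0100 are the documented Qc3EncryptData parameter formats; the field names (clearData, ALGctx, encData, and so on) are illustrative, and ALGctx stands for the algorithm context token created earlier with Qc3CreateAlgorithmContext.

```rpgle
      * Encrypt <clearData> using the algorithm and key contexts.
      * Field names are illustrative; KEYctx is the same token whether
      * the key was hard coded or retrieved from Alliance Key Manager.
     C                   eval      clearLen = %len(%trimr(clearData))
     C                   eval      encLenPrv = %size(encData)
     C                   eval      encProv = '0'
     C                   callp     Qc3EncryptData( clearData :clearLen
     C                                       :'DATA0100' :ALGctx
     C                                       :'ALGD0100' :KEYctx
     C                                       :'KEYD0100' :encProv
     C                                       :*OMIT      :encData
     C                                       :encLenPrv  :encLenRtn
     C                                       :QUSEC)
```

Because the key context token is the only thing the encryption call sees, nothing downstream of CrtKeyCtx has to change at all.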

Pretty simple process. Not bad for a day’s work.

Of course, there are proper ways to manage and protect an encryption key that has been retrieved from a key server, but we won’t go into that here. I want to save that topic for another day as it applies to many different application environments.

I hope you’ve gotten the idea that good key management doesn’t have to be a difficult, scary process. We are helping customers get this done today, and you can get there, too.

Click here to learn more about Alliance Key Manager and request an evaluation today.

Patrick

Topics: IBM i, PCI DSS, Encryption Key Management