Townsend Security Data Privacy Blog

How Do I Encrypt Data and Manage Encryption Keys Using Java in Amazon Web Services (AWS)?

Posted by Patrick Townsend

If you are a Java developer, you probably know that the Java language has full native support for AES encryption. You don’t need any third-party SDKs or add-ins to use industry-standard, strong encryption. The standard Java APIs are based on industry standards and are very efficient. Don’t hesitate to use this built-in facility. You include it in your Java application like this:

import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
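For instance, here is a minimal, self-contained sketch of AES-CBC encryption and decryption using only these standard classes. The key and IV are generated locally for illustration; in practice the key would come from a key manager:

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;

public class AesCbcExample {
    public static void main(String[] args) throws Exception {
        // Generate a 256-bit AES key (in production, retrieve it from a key manager)
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        SecretKey key = kg.generateKey();

        // CBC mode requires a random 16-byte initialization vector
        byte[] iv = new byte[16];
        new SecureRandom().nextBytes(iv);

        // Encrypt
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        byte[] ciphertext = cipher.doFinal("sensitive data".getBytes(StandardCharsets.UTF_8));

        // Decrypt with the same key and IV
        cipher.init(Cipher.DECRYPT_MODE, key, new IvParameterSpec(iv));
        String recovered = new String(cipher.doFinal(ciphertext), StandardCharsets.UTF_8);
        System.out.println(recovered);  // sensitive data
    }
}
```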

Encryption key management is another story. To implement good encryption key management you will need an enterprise key management solution and its Java library. Our Alliance Key Manager for AWS solution provides a Java SDK to help you with encryption key use. The Alliance Key Manager Java SDK lets you easily retrieve an encryption key for use in your application, or, alternatively, send data to Alliance Key Manager over a secure connection so that the encryption or decryption task can be performed directly on the key server. This encryption service is helpful in situations where you don’t want to expose the encryption key in your application or server environment.

Many developers use the Java Keystore (JKS/JCEKS) facility for storing encryption keys. The Java key store is more a key storage facility than a key management facility, and it rarely meets compliance regulations for separating keys from the data they protect, providing for separation of duties, and dual control. If you are currently storing encryption keys in a JKS repository you may want to consider moving them to a true key management solution like Alliance Key Manager.
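For comparison, here is a sketch of what key storage in a JCEKS keystore looks like; the alias and password are illustrative. Note that the keystore file, its password, and the protected data typically all live on the same server, which is exactly the separation problem described above:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.security.KeyStore;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;

public class JceksExample {
    public static void main(String[] args) throws Exception {
        char[] storePass = "changeit".toCharArray();

        // JCEKS (unlike plain JKS) can hold symmetric keys
        KeyStore ks = KeyStore.getInstance("JCEKS");
        ks.load(null, storePass);

        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        SecretKey key = kg.generateKey();
        ks.setEntry("data-key",
                new KeyStore.SecretKeyEntry(key),
                new KeyStore.PasswordProtection(storePass));

        // The keystore file is stored on the same server as the data it protects
        File f = File.createTempFile("keystore", ".jceks");
        f.deleteOnExit();
        try (OutputStream out = new FileOutputStream(f)) {
            ks.store(out, storePass);
        }

        // Reload the keystore and retrieve the key by alias
        KeyStore ks2 = KeyStore.getInstance("JCEKS");
        try (InputStream in = new FileInputStream(f)) {
            ks2.load(in, storePass);
        }
        SecretKey loaded = (SecretKey) ks2.getKey("data-key", storePass);
        System.out.println(loaded.getAlgorithm());  // AES
    }
}
```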

One of the advantages of the Alliance Key Manager SDK is the built-in high availability failover facility. With the SDK, in the event of a network or other failure your application automatically fails over to a secondary HA key server in real time. This means your application keeps running even though a network or system error prevents access to the primary key server.
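Conceptually, the failover pattern looks something like the sketch below. The interface and method names here are illustrative stand-ins, not the actual Alliance Key Manager SDK API:

```java
import java.util.List;

public class FailoverExample {
    // Illustrative stand-in for a key-server client; not the actual AKM SDK API
    interface KeyServerClient {
        byte[] retrieveKey(String keyName) throws Exception;
    }

    // Try the primary server first, then each HA mirror in order
    static byte[] retrieveWithFailover(List<KeyServerClient> servers, String keyName)
            throws Exception {
        Exception last = null;
        for (KeyServerClient server : servers) {
            try {
                return server.retrieveKey(keyName);
            } catch (Exception e) {
                last = e;  // network or server failure: fall through to the next mirror
            }
        }
        throw new Exception("all key servers unreachable", last);
    }

    public static void main(String[] args) throws Exception {
        // Simulate a failed primary and a healthy HA mirror
        KeyServerClient failing = name -> { throw new Exception("primary down"); };
        KeyServerClient backup  = name -> new byte[] {1, 2, 3};
        byte[] key = retrieveWithFailover(List.of(failing, backup), "AES256-Key1");
        System.out.println(key.length);  // 3
    }
}
```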

The Java SDK for Alliance Key Manager includes all of the support needed to make a secure connection to the key server, retrieve an encryption key, access the encryption and decryption services on Alliance Key Manager, and perform other common functions. By using the SDK the Java developer can avoid writing all of the code needed to perform these tasks – the work needed to retrieve an encryption key is reduced to a few lines of code.  We think this is a big bonus for the Java developer and helps make their lives easier. And sample source code will really speed along the process.

Here is an extract of the sample source code showing the retrieval of an encryption key from Alliance Key Manager, an encryption of some plaintext, and the decryption of that ciphertext:

// Note: Full sample source available (this is just an extract)

import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

import com.townsendsecurity.akmcore.AkmException;
import com.townsendsecurity.akmcore.AkmUtil;
import com.townsendsecurity.akmcore.AkmRequest;

import com.townsendsecurity.akmkeys.AkmKeyRequest;
import com.townsendsecurity.akmkeys.AkmSymKey;

// The AKM configuration file
String sCfgFile = "/path/jakmcfg.xml";

// Create a key request object initialized from the configuration file
AkmKeyRequest keyRQ = AkmKeyRequest.getInstance(sCfgFile);

// Define the key name and the key instance (version) name
String sKey = "some-key";        // the name of the key as defined on the key server
String sInstance = "some-name";

// Retrieve the encryption key from Alliance Key Manager
AkmSymKey symkey = keyRQ.retrieveSymKey(sKey, sInstance);

// Create an encryption context from the retrieved key bytes
EncryptDecryptCBC cryptor = new EncryptDecryptCBC(symkey.getKeyBytes());

// Encrypt some plaintext
byte[] ciphertext = cryptor.encryptSymmetric(plaintext.getBytes());

// Decrypt the ciphertext
byte[] plainbuf = cryptor.decryptSymmetric(ciphertext);

There is no charge for the Java SDK and all Alliance Key Manager customers have access to the Java SDK and sample code. AWS customers must register on the Townsend Security web site to get access to the Java code. You can do that here.

Meeting Best Practices for Protecting Information in AWS

Tags: Alliance Key Manager, Amazon Web Services (AWS), Encryption Key Management, Encryption


How Can I Manage and Monitor Alliance Key Manager in Amazon Web Services (AWS)?

Posted by Patrick Townsend

Alliance Key Manager for AWS runs as a stand-alone EC2 virtual machine in the AWS cloud. This is an Infrastructure-as-a-Service (IaaS) implementation, which means that the solution includes the operating system (Linux) and the key manager application, all within the AMI and active EC2 instance. There are several ways you can manage and monitor the key server.

The server components of Alliance Key Manager, such as the network interface, firewall, system logging, backup, and other operating system components are managed through a secure web browser interface. The secure browser session does not provide for management of encryption keys, but lets you perform various common server management tasks.

The Alliance Key Manager Administrative Console is a PC application that provides a GUI interface for the secure management of encryption keys and key access policy. You can manage multiple key managers through a single console instance and you can check the status of the key manager.

The Linux operating system of Alliance Key Manager provides a number of other ways to manage and monitor the key server. You can set firewall rules to control which client systems are authorized to access the key server, and you can set up system log forwarding to your log collection server or Security Information and Event Management (SIEM) solution running either inside or outside of AWS. Actively monitoring your security systems like Alliance Key Manager is a security best practice. You can easily monitor Linux system logs, web logs, firewall logs, and other system logs and transmit them to your SIEM log collection server.

Alliance Key Manager also creates audit and diagnostic logs that you can forward with the native syslog-ng daemon within the key manager. Every user and administrative access to Alliance Key Manager is logged to the audit file, and errors are logged to the error log file. Both of these files should be forwarded to your SIEM log collection server.

Alliance Key Manager also implements a “NO-OP” command to provide for monitoring the up/down status of the key server. Your monitoring solution can periodically issue the No-Op command to the key server to determine the status of current key operations. The No-Op command is a lightweight transaction and you can safely monitor the status of the key server every few seconds if desired.
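The No-Op wire format is specific to Alliance Key Manager and isn't reproduced here, but the monitoring pattern reduces to a short-timeout probe of the key service port at a regular interval. A generic TCP reachability sketch (a local test listener stands in for the key server):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.ServerSocket;
import java.net.Socket;

public class HealthProbeExample {
    // Returns true if a TCP connection to host:port succeeds within timeoutMs
    static boolean isUp(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) throws Exception {
        // Stand in for the key server with a local listener on an ephemeral port
        try (ServerSocket server = new ServerSocket(0)) {
            int port = server.getLocalPort();
            boolean up = isUp("127.0.0.1", port, 500);
            System.out.println(up);  // true
        }
    }
}
```

A monitoring job would call a probe like this every few seconds and raise an alert when it fails.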

Many of our customers ask us if they can install third-party management and monitoring applications. In the past we’ve been restrictive about the installation of third party components, but we came to realize how important they are to many AWS customers. We now allow you to install these applications with the caveat that you must be responsible for their secure installation and deployment. Our customers are now installing applications like Chef and Nagios for active monitoring.

You should also be aware that Amazon provides a number of monitoring tools that you can use with any EC2 instance. One of the most common is Amazon CloudWatch.

You can use the CloudWatch facility to monitor the status of your Alliance Key Manager EC2 instance. This can help with early detection of potential problems.

Lastly, Alliance Key Manager is an API-driven enterprise key management solution. That is, all key management tasks that are performed from the Administrative Console can be performed from user applications or from the command line. In fact, the Administrative Console is built on these APIs. You can create your own applications that drive these functions without user intervention if you need to. This facility is very helpful for our partners who need to embed automated key management into their own AWS solutions.

You can find more information about Alliance Key Manager for AWS here.

How to Meet Best Practices for Protecting Information in AWS by Stephen Wynkoop

Tags: Alliance Key Manager


IBM i, PCI DSS 3.2, and Multi-Factor Authentication

Posted by Luke Probasco

With the recent update to the Payment Card Industry Data Security Standard (PCI DSS) regarding multi-factor authentication (also known as Two Factor Authentication or 2FA), IBM i administrators are finding themselves faced with the requirement of deploying an authentication solution within their cardholder data environments (CDE). Prior to version 3.2 of PCI DSS, remote users were required to use two factor authentication for access to all systems processing, transmitting, or storing credit card data. With version 3.2 this is now extended to include ALL local users performing administrative functions in the CDE.

Here is an excerpt from section 8.3: (emphasis added)

8.3 Secure all individual non-console administrative access and all remote access to the cardholder data environment (CDE) using multi-factor authentication.

I recently was able to sit down with Patrick Townsend, Founder & CEO of Townsend Security, and talk with him about PCI DSS 3.2, what it means for IBM i users, and what IBM i users can do to meet the latest version of PCI DSS.

Thanks for taking some time to sit down with me, Patrick. Can you recap the new PCI-DSS version 3.2 multi-factor authentication requirement? This new requirement seems to be generating a lot of concern.

Well, I think the biggest change in PCI DSS 3.2 is the requirement for multi-factor authentication for all administrators in the cardholder data environment (CDE). Prior to 3.2, remote users, like contractors and third-party administrators, had to use multi-factor authentication to log in to the network. This update extends the requirement of multi-factor authentication to ALL local, non-console users. We are seeing businesses deploy multi-factor authentication at the following levels:

  •      Network Level - When you first access the network
  •      System Level – When you access a server or any system within the CDE
  •      Application Level – Within your payment application

The requirement for expanded multi-factor authentication is a big change and is going to be disruptive for many merchants and processors to implement.

Yeah, sounds like this is going to be a big challenge.  What does this mean for your IBM i customers?

There are definitely some aspects of this PCI DSS update that will be a bigger challenge on the IBM i compared to Windows or Linux. First, we tend to run more applications on the IBM i. In a Windows or Linux environment you might have one application per server. On the IBM i platform, it is not uncommon to run dozens of applications. What this means is that you have more users with administrative privileges to authenticate – on average there can be 60 or more, and they can sometimes be a challenge to identify! When merchants and processors look at their IBM i platforms, they will be surprised at the number of administrators they discover.

Additionally, the IBM i typically has many network services exposed (FTP, ODBC, Operations Navigator, etc.). The challenge of identifying all the entry points is greater for an IBM i customer.

You say it is harder to identify an administrative user, why is that?

On the IBM i platform, there are some really easy and some really difficult administrative users to identify.  For example, it is really easy to find users with QSECOFR (similar to a Windows Administrator or Linux Root User) privileges.  But it starts to get a little more difficult when you need to identify users, for example, who have all object (*ALLOBJ) authority.  These users have almost the equivalent authority of QSECOFR.  Finding, identifying, and properly inventorying users as administrators can be a challenge.

Additionally, with a user profile, there is the notion of a group profile.  It is possible for a standard user, who may not be an administrator, to have an administrative group profile.  To make it even more complex, there are supplemental groups that can also adopt elevated authority.  Just pause for a minute and think of the complex nature of user profiles and how people implement them on the IBM i platform.  And don’t forget, you may have users on your system who are not highly privileged directly through their user profile, but may be performing administrative tasks related to the CDE.  Identifying everyone with administrative access is a big challenge.

Townsend Security has a multi-factor authentication solution for the IBM i.  How are you helping customers deal with identifying administrators?

From the beginning, we realized this would be a problem and we have taken some additional steps, specifically related to PCI DSS 3.2 to help IBM i customers identify administrators.  We made it possible to build a list of all users who have administrative access and then require them to use multi-factor authentication when logging on.  We have done a lot to help the IBM i security administrator identify highly privileged users and enroll them into a two factor authentication solution, and then periodically monitor/update/audit the list.

What are some of the other multi-factor authentication challenges that IBM i customers face?

Some of them are pretty obvious.  If you don’t have a multi-factor authentication solution in place, there is the effort of evaluating and deploying something on the IBM i server.  You’ll find users who may already have a multi-factor authentication solution in place for their Windows or Linux environments, but haven’t extended it to their IBM i.  Even if they aren’t processing credit card numbers on the IBM i, if it is in the CDE, it still falls within the scope of PCI DSS.

Aside from deploying a solution, there is going to be administrative work involved. For example, managing the new software, developing new procedures, and putting governance around multi-factor authentication. Further, if you adopt a hardware-based solution with key FOBs, you have to have processes in place to distribute and replace them, as well as manage the back-end hardware. It has been really great seeing organizations move to mobile-based authentication solutions based on SMS text messages, where there isn’t any hardware or FOBs to manage. Townsend Security’s Alliance Two Factor Authentication went that route.

Let’s get back to PCI DSS.  As they have done in the past, they announced the updated requirements, but businesses still have a period of time to get compliant.  When does PCI DSS 3.2 actually go into effect?

The PCI SSC always gives merchants and processors time to implement new requirements.  The actual deadline to meet compliance is February 1, 2018.  However, what we are finding is that most merchants are moving rapidly to adopt the requirements now.  When an organization has an upcoming audit or Self Assessment Questionnaire (SAQ) scheduled, they generally will want to meet the compliance requirements for PCI DSS 3.2.  It never is a good idea to wait until the last minute to try and deploy new technology in order to meet compliance.

You mentioned earlier that you have a multi-factor authentication solution.  Tell me a little bit about it.

Sure. Alliance Two Factor Authentication is a mature, cost-effective solution that delivers PINs to your mobile device (or voice phone), rather than through an expensive key FOB system. IBM i customers can significantly improve the security of their IBM i systems through implementation of proven two factor authentication.  Our solution is based on a non-hardware, non-disruptive approach.  Additionally, we audit every successful or failed authentication attempt and log it in the security audit journal (QAUDJRN).  One thing that IBM i customers might also be interested in, is in some cases, we can even extend their existing multi-factor authentication solution to the IBM i with Alliance Two Factor Authentication.  Then they can benefit from the auditing and services that we supply for the IBM i platform.  Our goal was to create a solution that was very cost-effective, rapid to deploy, meets compliance regulations, and doesn’t tip over the IT budget.

Download a podcast of my complete conversation here and learn more about what PCI DSS 3.2 means for IBM i administrators, how to identify administrative users, challenges IBM i customers are facing, and how Townsend Security is helping organizations meet PCI DSS 3.2.

Podcast: IBM i, PCI DSS, and Multi-Factor Authentication

 

Tags: 2FA, IBM i, PCI DSS


How Can I Be Sure I Never Lose My Encryption Keys in Amazon Web Services (AWS)?

Posted by Patrick Townsend

As organizations move to the cloud, the topics of encryption and key management are top concerns.  "How can I be sure that I never lose my encryption keys?" is one that we hear a lot.  With Alliance Key Manager (AKM), Townsend Security's FIPS 140-2 compliant encryption key manager, you never have to worry about that! There are several layers of protection that help put this worry to rest. Let’s take a look at them in order.

Backup and Restore

The first layer of protection is that Alliance Key Manager gives you a complete backup and restore facility, including both a manual and an automated facility. At any time you can run the manual backup operation to back up your key database, certificates, configurations and access control definitions. This backup can be sent to your own secure server either in the AWS cloud or in your own data center. You can also create the backup image and download it directly to your own server for safekeeping.

Alliance Key Manager also supports automatic backup to a secure server on a schedule you define. You can back up your encryption keys daily, weekly, or monthly. Secure off-line backup is the first layer of protection.

High Availability

Most of our customers in AWS will deploy a second instance of Alliance Key Manager as a high availability failover key server. You can deploy the HA instance of the key server in a different region, or even completely outside of the AWS cloud. Once you deploy the secondary HA instance of the AKM key server you can start mirroring your data keys from the primary production instance of the key server to this secondary HA instance of the key server. Keys and access policies are securely mirrored in real time and the mirror architecture is active-active. This means that if you fail over to the secondary key server, create keys or make changes to key access policies, these will be mirrored back to the production key server in real time. Key mirroring provides a second layer of protection from key loss.

For customers concerned about protection from failures of the AWS cloud platform itself, you can mirror encryption keys to a key server outside of the AWS cloud. That secondary mirror key server can be located in your data center, in another cloud service provider platform, or in a hardware security module (cloud HSM) in a hosting center. Note that there is no limit to the number of backup mirror key servers that you can configure. Alliance Key Manager supports a many-to-many architecture for key mirroring.

Export Encryption Keys

A third layer of protection is provided by the key export facility of Alliance Key Manager. You can securely export individual encryption keys to your own internal systems. The key export facility also provides you with the ability to share an encryption key with another user or organization.

Separation of Duties & Dual Control

Using Separation of Duties and Dual Control can provide a fourth layer of protection for encryption keys. This level of protection is especially helpful for protecting against insider threats. You can create a separate AWS account for use by your security administrators to create and manage encryption keys. These key management administrators would have no access to normal AWS instances where you store sensitive data, and your normal AWS administrators would have no access to the key management account. With Dual Control activated in Alliance Key Manager, at least two security administrators must authenticate to the server to make changes to or delete encryption keys.

Stand-alone Instance

Lastly, Alliance Key Manager runs as a stand-alone EC2 instance in the AWS cloud. You are automatically taking advantage of the security, resilience and recoverability provided by Amazon. Always use good AWS account security and management practices to help protect your sensitive data and encryption keys!

It may theoretically be possible to lose an encryption key, but you are going to have to work very hard to do so! Alliance Key Manager takes the fear of key loss out of your encryption strategy in AWS.

You can find more information about Alliance Key Manager for AWS here.

Meeting Best Practices for Protecting Information in AWS

Tags: Amazon Web Services (AWS), Encryption Key Management


Managed FTP Services on the IBM i – Look for These 8 Features

Posted by Patrick Townsend

In a previous blog I talked about the security features that you should find in a Managed FTP solution. Of course, we look for the security components first, as we want to be very sure that our data is protected in transit and at rest when it arrives at its destination. But with the high volume of FTP transfer activity in the modern organization, we also want to find a number of automation and management features in our Managed FTP solution. That’s the focus of today’s blog.

Here are the eight main elements of a Managed FTP solution for the IBM i (iSeries, AS/400) platform:

  1. Automation

  2. Scheduling

  3. Application integration

  4. Diagnostic logging

  5. Notification and Exception handling

  6. Resource management

  7. File system support (DB2, IFS, etc.)

  8. Commands and APIs

Let’s take these areas one at a time.

Automation: By its nature FTP is a manual process. This is one of the original protocols of the Internet and it was designed as a command line facility. But our modern IT systems need a solution that is hands-off and lights-out. A good Managed FTP solution should allow you to fully automate both inbound and outbound file transfers. And because our IBM i servers are often located inside the firewall, we need to be able to detect and pull files that are available on remote and external servers. We sometimes call this the automatic scan of remote servers and it is a critical automation component. Your Managed FTP solution should allow you to automate every aspect of sending and receiving files, including encryption of files you are sending and decryption of files that you receive.

Scheduling: Many file transfers have to happen at a certain time of day. This means that your Managed FTP solution should provide for intelligent scheduling of file transfers. Scheduled transfers might happen hourly, once a day, once a week, or once a month. But the scheduling facility should accommodate your transfer needs. Additionally, the ability to schedule a transfer through a third party scheduling application should be fully supported.

Application integration: When you receive a file via FTP it should be possible to automatically decrypt the file and automatically process it into your applications. This level of automation removes the need for human intervention and provides data in a timely fashion to your applications and ultimately to your users. Look for your Managed FTP solution to provide callable exit points, library and IFS directory scan facilities, and plenty of sample programs that you can use to start your automation projects.

Diagnostic logging: It is easy to underestimate the importance of built-in diagnostic logging in a Managed FTP solution. When you are processing many files every day, and when you are processing time critical files (think payroll files), you have to be able to identify the cause of a transfer problem very quickly. A diagnostic log should be available for every transfer and should clearly identify the causes of failures. FTP sessions can fail for a wide variety of reasons including network outages, password changes, remote configuration changes, expired certificates and keys, and many other issues. The presence of diagnostic logging means the difference between a long night hunched over a terminal or a leisurely trip to the pub!

Exception handling: A good Managed FTP solution will tell you when things go wrong. From my point of view this is both a good thing AND a bad thing. We have customers who run our solutions for years and forget that they are there! But this is what you want. A Managed FTP solution should tell you when a transfer failed and give you some clues on the resolution. In our Managed FTP solution notifications are done by email and you have a lot of choices – you can get notified on failure, notified on successful transfer, or notified on all activity. But it is the ability to get notified on failure that is so critical.  Exception handling should also include automatically retrying a failed transfer operation. Look for the ability of your Managed FTP solution to retry a transfer at least three times before reporting a problem!
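The retry-then-notify logic described above can be sketched as follows; the transfer interface and attempt count are illustrative:

```java
public class RetryExample {
    interface Transfer {
        void run() throws Exception;
    }

    // Attempt the transfer up to maxAttempts times before reporting failure
    static boolean transferWithRetry(Transfer t, int maxAttempts) {
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                t.run();
                return true;   // success: stop retrying
            } catch (Exception e) {
                // a real solution would log the failure and wait before retrying
            }
        }
        return false;          // all attempts failed: send a failure notification
    }

    public static void main(String[] args) {
        // Simulated transfer that fails twice, then succeeds on the third attempt
        int[] calls = {0};
        Transfer flaky = () -> {
            if (++calls[0] < 3) throw new Exception("network outage");
        };
        boolean ok = transferWithRetry(flaky, 3);
        System.out.println(ok + " after " + calls[0] + " attempts");  // true after 3 attempts
    }
}
```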

Resource management: We don’t think of FTP as a CPU or disk intensive operation, and that is generally true. But imagine what it might be like to transfer several thousand files a day!  Those small individual file transfers start to add up in terms of resource utilization pretty fast.  Your IBM i Managed FTP solution should allow you to manage job priorities, schedule transfers during off hours of light usage, manage CPU time slice and pool allocations, and many other aspects of resource management.

File system support: As IBM i users we have a lot of data stored in DB2 files and tables. But we also may have a lot of information stored in the Integrated File System (IFS). A Managed FTP solution should support these file systems for both inbound and outbound transfers. Also consider those special file system requirements. Can you manage file transfers in a Windows network shared folder? Or a Linux/Unix NFS mounted volume? Or in a mounted drive for a remote IBM i server through the File400 folder? These can be important features for an IBM i solution.

Commands and APIs: Last but not least, there are always things we can’t do with the ready-to-use features of a Managed FTP solution. We will want to have access to IBM i commands and APIs to help us handle those special situations. In our Alliance FTP Manager solution we give you access to every single FTP operation directly from your RPG and CL applications. You can perform every aspect of an FTP session under program control, and know whether it succeeded or failed, and why. And of course, command interfaces make it easy to put or get a single file. You might not initially miss the rich set of APIs, but the day will come when you need them!

In this blog I’ve tried to give you a feel for the basic set of features that you should find in a Managed FTP solution. You can learn more about our Alliance FTP Manager solution for the IBM i platform here.

Patrick

Secure Managed File Transfer for IBM i

Tags: Managed File Transfer, Secure Managed File Transfer, FTP Manager for IBM i


Who Has Access to My Encryption Keys in Amazon Web Services (AWS)?

Posted by Patrick Townsend

One of the most common questions we get here at Townsend Security is something like “Who has access to my encryption keys in AWS?” It is a natural question to ask and it can be hard to determine the answer to this question with many key management solutions - including the key management services provided by Amazon. Let me try to answer this question for our Alliance Key Manager for AWS.

Alliance Key Manager for AWS runs as a stand-alone EC2 instance in Amazon Web Services. There is no component of Alliance Key Manager that is shared by other users of AWS, and there is no component of Alliance Key Manager that uses encryption key management services provided by Amazon in AWS. Neither Amazon nor Townsend Security hold any credentials that grant access to the key manager solution, and there are no “backdoors” to the key manager. You, the AWS customer, solely and exclusively manage it.

Encryption keys in Alliance Key Manager are managed by the Alliance Key Manager Administrative Console. This is an application that you install on your PC and which accesses one or more instances of Alliance Key Manager in AWS. While you could install the administrative console in an EC2 instance in AWS, we recommend that you install it on a secure PC outside of AWS. You maintain full control over the application used to manage keys.

The administrative console connects to Alliance Key Manager over a secure TLS session using certificates that are issued by the Alliance Key Manager instance. That is, only administrators using PKI certificates known and authenticated by the specific key manager are allowed to perform management functions.

The use of encryption keys by applications or users inside of AWS or outside of AWS is likewise controlled by secure TLS sessions that are also validated to the specific key manager instance and certificate authority. Just having a valid certificate from Verisign or other certificate authority is not adequate to gain access to encryption keys.

An additional layer of encryption key access control allows you to restrict an encryption key to a user or group as defined on the client-side certificate. This level of key access control leverages the Common Name (CN) and Organizational Unit (OU) of the client-side certificate to control access to a key. If you specify that a key can only be accessed by user “Bill” in the group “Sales”, then Alliance Key Manager will inspect the connecting session to be sure that the certificate Common Name contains the value “Bill” and that the certificate Organizational Unit is “Sales”. Access is denied unless this rule is met.
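That check can be sketched with standard JDK classes, parsing the CN and OU out of a certificate subject name. The subject here is built by hand for illustration; in a real check it would come from the client's TLS certificate:

```java
import javax.naming.ldap.LdapName;
import javax.naming.ldap.Rdn;
import javax.security.auth.x500.X500Principal;

public class KeyAccessCheckExample {
    // Extract one attribute (e.g. "CN" or "OU") from an X.500 distinguished name
    static String attribute(X500Principal subject, String type) throws Exception {
        for (Rdn rdn : new LdapName(subject.getName()).getRdns()) {
            if (rdn.getType().equalsIgnoreCase(type)) {
                return rdn.getValue().toString();
            }
        }
        return null;
    }

    public static void main(String[] args) throws Exception {
        // In a real check, this subject comes from the client-side TLS certificate
        X500Principal subject = new X500Principal("CN=Bill, OU=Sales, O=Example Corp");

        // Allow access only if both the CN and OU match the key's access policy
        boolean allowed = "Bill".equals(attribute(subject, "CN"))
                       && "Sales".equals(attribute(subject, "OU"));
        System.out.println(allowed);  // true
    }
}
```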

Lastly, if an unauthorized user gains access to the Alliance Key Manager encryption key database they will not have access to the actual encryption keys. Data encryption keys (DEK) are encrypted by key encryption keys (KEK) which are stored separately. A stolen backup or copied key database file will be insufficient to gain access to the encryption keys.
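The DEK/KEK pattern can be sketched with the JDK's standard AES key-wrap cipher. Both keys are generated locally here for illustration; on the key server the KEK is held separately from the key database:

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import java.util.Arrays;

public class KeyWrapExample {
    public static void main(String[] args) throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        SecretKey kek = kg.generateKey();  // key-encryption key, stored separately
        SecretKey dek = kg.generateKey();  // data-encryption key

        // Wrap the DEK under the KEK; only the wrapped bytes go in the key database
        Cipher wrapper = Cipher.getInstance("AESWrap");
        wrapper.init(Cipher.WRAP_MODE, kek);
        byte[] wrappedDek = wrapper.wrap(dek);

        // Without the KEK, the wrapped bytes are useless to an attacker.
        // With it, the original DEK is recovered exactly:
        wrapper.init(Cipher.UNWRAP_MODE, kek);
        SecretKey unwrapped = (SecretKey) wrapper.unwrap(wrappedDek, "AES", Cipher.SECRET_KEY);
        boolean match = Arrays.equals(dek.getEncoded(), unwrapped.getEncoded());
        System.out.println(match);  // true
    }
}
```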

You should be aware that any cloud service provider has low level access to your virtual machines and storage. That is true of Amazon’s cloud platform as it is with any other cloud platform. And you should also be aware that Amazon and other cloud service providers must obey the laws and regulations of the countries in which they operate. You cannot exclude the possibility that Amazon will provide access to your key management EC2 instance if required to do so under the law. In some countries this means that law enforcement organizations, national security agencies, and other governmental actors may have access to your encryption keys. And, while very unlikely, you cannot exclude the chance that an Amazon employee might make an unauthorized access to the EC2 instance of your key server. If these possibilities make you feel uncomfortable you should consider hosting your key management server outside of AWS. Townsend Security’s Alliance Key Manager solution can be hosted in your data center or in a hosting facility that you designate, and it can provide keys to your AWS applications.

You can find more information about Alliance Key Manager for AWS here.

Meeting Best Practices for Protecting Information in AWS

 

Tags: Alliance Key Manager, Amazon Web Services (AWS), Encryption Key Management


IBM SoftLayer, VMware and Getting to the Secure Cloud

Posted by Patrick Townsend

VMware customers have experienced some frustration around cloud migrations for quite some time. While VMware has attempted to provide a path to the cloud through their vCloud strategy, substantial barriers have made the migration difficult to achieve. And while VMware customers have achieved substantial cost and efficiency savings through VMware technologies, they have largely not been able to extend these benefits through cloud migrations. The new IBM and VMware partnership is changing this.

There are two aspects of the partnership that make this very different from other cloud offerings for VMware:

  • IBM will create dedicated cloud resources for VMware customers who migrate, essentially creating a private cloud platform. This will alleviate many of the concerns of Enterprise customers about the security of their cloud migration.
  • IBM has negotiated great pricing for VMware applications on the IBM SoftLayer cloud. This will let VMware customers experience an immediate ROI on the cloud migration.

VMware customers have invested a great deal in the architecture and administration of their VMware environments. VMware applications let customers create secure segmented networks, apply security and access controls, monitor the status of their applications, automate business recovery operations, and perform a number of other administrative functions in a coherent and cost-effective manner. Finding a way to the cloud that preserves these benefits is crucial for the VMware customer.

I think IBM has found the right path for these customers.

IBM is not the first to attempt to address the needs of the VMware customer. Rackspace and others have cloud migration plans and have been working with VMware customers. But I think IBM SoftLayer has solved some problems that will open the path to VMware in the cloud.

One of the remaining challenges for VMware customers is to get their third-party application and security vendors to embrace this move to the cloud. While the IBM VMware offering is a true native VMware implementation that will reduce the technical issues, software vendors need to be explicit about their support for solutions deployed on a cloud platform. VMware customers can find it confusing and frustrating to get clear statements of support from vendors.

At Townsend Security we are trying to support our customers migrating our products in VMware to the IBM SoftLayer cloud. Our Alliance Key Manager and related encryption products are already validated for deployment in VMware. And we’ve extended our support for the VMware platform by validating our encryption and key management to the PCI Data Security Standard. We now also support the migration or deployment of our solutions in the IBM SoftLayer cloud. This gives IBM SoftLayer customers the confidence of moving their encryption security infrastructure to the IBM SoftLayer platform.

It has been amazing how the PCI-DSS validation of our solutions in VMware has helped VMware customers meet PCI requirements. VMware deserves a lot of credit for creating a formal program for PCI validation on a well-defined VMware reference architecture. Working with Coalfire, a security auditing firm, we were able to very quickly certify our encryption and key management solutions to the PCI-DSS standard. This has helped multiple customers quickly move through the compliance challenge, and this applies to the IBM SoftLayer platform, too. Because IBM SoftLayer retains the dedicated resources for the customer, compliance will be easy to achieve.

IBM and VMware have created a great path to the cloud for VMware customers. I know that many challenges will remain for customers with more complex VMware architectures, especially with hybrid environments. But I think the path to the cloud just got a bit straighter and easier.

Patrick


Tags: VMware, Cloud Security


Encryption & Key Management for the IBM i

Posted by Luke Probasco

Excerpt from the eBook "IBM i Encryption with FieldProc - Protecting Data at Rest."


Encryption in FieldProc

It goes without saying that your FieldProc application will need to use an encryption library to perform encryption and decryption operations. IBM provides an encryption software library as a native part of the IBM i operating system. It is available to any customer or vendor who needs to implement encryption and decryption in their FieldProc programs.

Unfortunately the native IBM encryption library is very slow. This might not be noticeable when encrypting or decrypting a small amount of data, but batch operations can be negatively impacted. The advent of AES encryption on the Power8 processor has done little to mitigate the performance issue with encryption. IBM i customers and third-party vendors of FieldProc solutions should use caution when implementing FieldProc using the native IBM i AES software libraries. They are undoubtedly accurate implementations of AES encryption, but suffer on the performance front.

Key Management

An encryption strategy is only as good as the key management strategy, and it is difficult to get key management right. For companies doing encryption the most common cause of an audit failure is an improper implementation of key management. Here are a few core concepts that govern a good key management strategy:

  • Encryption keys are not stored on the same system as the sensitive data they protect.
  • Security administrators of the key management solution should have no access to the sensitive data, and database administrators should have no access to encryption key management (Separation of Duties).
  • On the IBM i system this means that security administrators such as QSECOFR and any user with All Object (*ALLOBJ) should not have access to data encryption keys or key encryption keys.
  • More than one security administrator should authenticate before accessing and managing keys (Dual Control).
  • All access to encryption keys should be logged and audited. This includes use of encryption keys as well as management of keys.
  • Encryption keys should be mirrored/backed up in real time to match the organization’s standards for system availability.

Encryption Key Caching

Encryption keys are often used frequently when batch operations are performed on sensitive data. It is not unusual that a batch program would need to perform millions or tens of millions of encryption and decryption operations. While the retrieval of an encryption key from the key server may be very efficient, performance may suffer when keys need to be retrieved many times. This can be addressed through encryption key caching in the local environment.

Secure key caching should be performed in separate program modules such as a service program and should not be cached in user programs where they are more subject to discovery and loss. Any module caching an encryption key should have debugging options disabled and visibility removed. Secure key caching is critical for system performance and care should be taken to protect storage.
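As a generic illustration of the caching pattern (not code from any vendor SDK, and class and method names are ours), a minimal time-limited key cache might look like this, with a pluggable function standing in for the key server retrieval:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class KeyCache {
    private static final class Entry {
        final byte[] key; final long expiresAt;
        Entry(byte[] key, long expiresAt) { this.key = key; this.expiresAt = expiresAt; }
    }

    private final Map<String, Entry> cache = new HashMap<>();
    private final long ttlMillis;
    private final Function<String, byte[]> fetchFromServer;
    public int serverFetches = 0; // exposed here for illustration only

    public KeyCache(long ttlMillis, Function<String, byte[]> fetchFromServer) {
        this.ttlMillis = ttlMillis;
        this.fetchFromServer = fetchFromServer;
    }

    // Return the cached key, fetching a fresh copy when missing or expired.
    public synchronized byte[] getKey(String keyName) {
        long now = System.currentTimeMillis();
        Entry e = cache.get(keyName);
        if (e == null || now >= e.expiresAt) {
            byte[] key = fetchFromServer.apply(keyName); // the expensive network call
            serverFetches++;
            e = new Entry(key, now + ttlMillis);
            cache.put(keyName, e);
        }
        return e.key;
    }

    public static void main(String[] args) {
        // A stand-in for the key server: always returns a 256-bit key.
        KeyCache cache = new KeyCache(60_000, name -> new byte[32]);
        cache.getKey("ORDERS.CCNO");
        cache.getKey("ORDERS.CCNO"); // served locally, no second server call
        System.out.println(cache.serverFetches); // 1
    }
}
```

In a production service program the cached key material would also be protected in storage, as discussed above, rather than held in a plain byte array.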

Encryption Key Rotation

Periodically changing the encryption keys (sometimes called “key rotation” or “key rollover”) is important to the overall security of your protected data. Both data encryption keys (DEK) and key encryption keys (KEK) should be changed at appropriate intervals. The appropriate interval for changing keys depends on a number of variables, including the amount of data the key protects and the sensitivity of that data, as well as other factors. This interval is called the cryptoperiod of the key and is discussed by NIST in Special Publication 800-57, “Recommendation for Key Management”. For most IBM i customers, rotation of data encryption keys should occur once a year and rotation of the key encryption keys should occur no less than once every two years.
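A cryptoperiod check reduces to simple date arithmetic. A trivial sketch, with illustrative dates and the one-year and two-year intervals suggested above:

```java
import java.time.LocalDate;
import java.time.Period;

public class CryptoperiodCheck {
    // True when the key, created on 'created', has reached the end of its cryptoperiod.
    public static boolean rotationDue(LocalDate created, Period cryptoperiod, LocalDate today) {
        return !today.isBefore(created.plus(cryptoperiod));
    }

    public static void main(String[] args) {
        LocalDate created = LocalDate.of(2015, 1, 15);
        // A DEK on a one-year cryptoperiod is due; a KEK on two years is not yet.
        System.out.println(rotationDue(created, Period.ofYears(1), LocalDate.of(2016, 6, 1))); // true
        System.out.println(rotationDue(created, Period.ofYears(2), LocalDate.of(2016, 6, 1))); // false
    }
}
```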

IBM i Encryption with FieldProc

Tags: Encryption, IBM i, FIELDPROC


Encryption & Key Management for SQL Server

Posted by Luke Probasco

Excerpt from the eBook "Encryption & Key Management for Microsoft SQL Server."


Microsoft SQL Server has become a ubiquitous storage mechanism for all types of digital assets. Protecting these data assets in SQL Server is a top priority for business executives, security specialists, and IT professionals. The loss of sensitive data can be devastating to the organization, and in some cases represents a catastrophic loss. There is no alternative to a digital existence, and cybercriminals, political activists, and state actors have become more and more adept at stealing this information. To properly protect this information, businesses are turning to encryption and key management.

Encryption

Encryption in the broadest sense means obscuring information to make it inaccessible to unauthorized access. But here we will use the term in its more precise and common use – the use of well-accepted encryption algorithms based on mathematical proofs, which have been embodied and approved as international standards.

Many approaches to encryption do not meet minimal requirements for security and compliance. Our definition of encryption excludes:

  • Homegrown methods developed by even experienced and talented programmers.
  • Emerging encryption methods that are not yet widely accepted.
  • Encryption methods that are widely accepted as secure, but which have not been adopted by standards organizations.
  • Data substitution and masking methods not based on encryption.

Examples of encryption methods that do meet our criteria include the Advanced Encryption Standard (AES), sometimes known as Rijndael, the Triple Data Encryption Standard (3DES), RSA, and Elliptic Curve encryption methods.

In the context of protecting data in a SQL Server database, the most common encryption method for protecting whole databases or an individual column in a table is AES. All key sizes of AES (128-bit, 192-bit, and 256-bit) are considered secure and are appropriate for protecting digital assets. Many organizations choose 256-bit AES for this purpose due to the larger key size and stronger security.
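The standard `javax.crypto` classes shown at the top of this article are all you need for the AES encryption step itself. A self-contained sketch using AES-256 in CBC mode (the key is generated locally here for brevity; in practice it would be retrieved from a key manager, and the class and method names are ours):

```java
import javax.crypto.Cipher;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;

public class AesColumnDemo {
    public static byte[] encrypt(byte[] keyBytes, byte[] iv, byte[] plaintext) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(keyBytes, "AES"), new IvParameterSpec(iv));
        return cipher.doFinal(plaintext);
    }

    public static byte[] decrypt(byte[] keyBytes, byte[] iv, byte[] ciphertext) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(keyBytes, "AES"), new IvParameterSpec(iv));
        return cipher.doFinal(ciphertext);
    }

    public static void main(String[] args) throws Exception {
        byte[] keyBytes = new byte[32]; // 256-bit key
        byte[] iv = new byte[16];       // use a fresh random IV per value
        SecureRandom rng = new SecureRandom();
        rng.nextBytes(keyBytes);
        rng.nextBytes(iv);

        byte[] ct = encrypt(keyBytes, iv, "4111111111111111".getBytes(StandardCharsets.UTF_8));
        String back = new String(decrypt(keyBytes, iv, ct), StandardCharsets.UTF_8);
        System.out.println(back); // 4111111111111111
    }
}
```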

One major additional benefit of using an industry standard such as AES is that it meets many compliance requirements or recommendations for the use of industry standard encryption. This includes the PCI Data Security Standard (PCI-DSS), HIPAA, FFIEC, and the EU General Data Protection Regulation (EU GDPR).

Key Management

It is not possible to discuss an encryption strategy without discussing the protection of encryption keys. An encryption strategy is only as good as the method used to protect the encryption keys. Encryption algorithms such as AES and Triple DES are public and readily available to any attacker. The protection of the encryption key is at the core of the security of the encrypted data. This is why security professionals consider the loss of the encryption key as equivalent to the loss of the digital assets. Once an attacker has the encryption key it is trivial to decrypt and steal the data.

Generating strong encryption keys and protecting them is harder than it might at first appear. The generation of strong encryption keys depends on the use of random number generation schemes, and modern computers do not excel at doing things randomly. Specialized software routines are needed to generate strong encryption keys. Encryption keys must also be securely stored away from the data they protect, and yet must be readily available to users and applications that are authorized to access the sensitive data. Authenticating that a user or application is authorized to an encryption key is a large focus of key management systems.
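In Java, those specialized routines are already built in: `KeyGenerator` backed by `SecureRandom` produces cryptographically strong keys. A minimal sketch:

```java
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import java.security.SecureRandom;

public class KeyGenDemo {
    // Generate a strong AES key using the platform's cryptographic RNG.
    public static SecretKey newAesKey(int bits) throws Exception {
        KeyGenerator gen = KeyGenerator.getInstance("AES");
        gen.init(bits, new SecureRandom()); // never derive keys from passwords or timestamps alone
        return gen.generateKey();
    }

    public static void main(String[] args) throws Exception {
        SecretKey key = newAesKey(256);
        System.out.println(key.getEncoded().length); // 32 bytes = 256 bits
    }
}
```

A key management system performs the same kind of generation, but inside a hardened server so the key material never rests alongside the data it protects.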

Over the years standards and best practices have emerged for encryption key management and these have been embodied in specialized security applications called Key Management Systems (KMS), or Enterprise Key Management (EKM) systems. The National Institute of Standards and Technology (NIST) has taken a lead in this area with the creation of Special Publication 800-57 entitled “Recommendation for Key Management”. In addition to this important NIST guidance, the organization publishes the Federal Information Processing Standard (FIPS) 140-2 “Security Requirements for Cryptographic Modules”. To serve the needs of organizations needing independent certification that a key management application meets this standard, NIST provides a validation program for FIPS 140-2 compliant systems. All professional key management systems have been validated to FIPS 140-2.

When protecting sensitive SQL Server data with encryption, look for these core principles of key management:

  • Encryption keys are stored away from the data they protect, usually on specially designed security devices or dedicated virtual servers.
  • Encryption keys are managed by individuals who do not have access to the data stored in the SQL Server database (Separation of Duties).
  • Encryption key management requires more than one security administrator to authenticate before performing any critical work on keys (Dual control).
  • Key retrieval requests from users and applications are authenticated using industry standard methods.
  • Encryption management and key usage are logged in real time and logs are stored on secure log collection servers.
  • Encryption key management systems have been validated to FIPS 140-2 and the Key Management Interoperability Protocol (KMIP).

These are just a few of the core requirements for deploying a professional key management solution to protect your SQL Server data.

Encryption and key management for SQL Server

 

Tags: Encryption, SQL Server


When Encrypting Databases, Does Key Connection for SQL Server Cache the Encryption Key?

Posted by Patrick Townsend

Customers who need to encrypt data in Microsoft SQL Server databases know that they must protect the encryption key with appropriate controls to meet compliance regulations and to achieve safe harbor in the event of a data breach. Townsend Security's Alliance Key Manager solution provides the Extensible Key Management (EKM) software to make proper key management a breeze. Called Key Connection for SQL Server, this EKM Provider software is installed on the server hosting the SQL Server database and it talks seamlessly to one or more Alliance Key Manager servers running in a separate server instance. Customers get proper key management that meets compliance regulations such as PCI-DSS in an easy-to-deploy solution.

Performance is always a consideration when it comes to enabling encryption, so customers naturally ask us about key caching. Does Key Connection for SQL Server cache the encryption keys to enable better performance?

The short answer is Yes, it does.

How it does key caching depends on whether you use Transparent Data Encryption (TDE) or Cell Level Encryption (CLE). Let’s drill into each of these cases.

Transparent Data Encryption (TDE)
The implementation of TDE by Microsoft involves encrypting the entire table space and the database logs. It is the easiest type of encryption to deploy as it requires no changes to the actual application that uses the SQL Server database. You can implement TDE encryption by installing the Key Connection for SQL Server software and issuing four commands through the SQL Server management console. Restart logging to ensure that it is encrypted and you are done.

So with TDE, how are keys managed? The TDE architecture involves SQL Server generating a symmetric key (usually a 256-bit AES key) and then asking Alliance Key Manager to encrypt it with an RSA key. This encrypted symmetric key is then stored on the server that hosts the SQL Server database. When you start SQL Server (or restart it, as the case may be) the SQL Server instance asks Alliance Key Manager to use RSA decryption to decrypt the symmetric key. Once that is complete the SQL Server instance has the key it needs and no longer needs to communicate with Alliance Key Manager. There is no need for key caching and the key will be decrypted the next time that SQL Server starts.
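The wrap/unwrap cycle described above can be sketched with the standard JCE wrap mode. This is a generic illustration of the RSA key-wrapping pattern, not the actual Key Connection for SQL Server implementation (the padding choice and key sizes here are our assumptions):

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import java.security.Key;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.PrivateKey;
import java.security.PublicKey;
import java.util.Arrays;

public class TdeKeyWrapDemo {
    public static byte[] wrapKey(PublicKey rsaPublic, SecretKey aesKey) throws Exception {
        Cipher c = Cipher.getInstance("RSA/ECB/OAEPWithSHA-256AndMGF1Padding");
        c.init(Cipher.WRAP_MODE, rsaPublic);
        return c.wrap(aesKey);
    }

    public static Key unwrapKey(PrivateKey rsaPrivate, byte[] wrapped) throws Exception {
        Cipher c = Cipher.getInstance("RSA/ECB/OAEPWithSHA-256AndMGF1Padding");
        c.init(Cipher.UNWRAP_MODE, rsaPrivate);
        return c.unwrap(wrapped, "AES", Cipher.SECRET_KEY);
    }

    public static void main(String[] args) throws Exception {
        // The key server holds the RSA pair; the database server stores only the wrapped AES key.
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(2048);
        KeyPair rsa = kpg.generateKeyPair();

        KeyGenerator gen = KeyGenerator.getInstance("AES");
        gen.init(256);
        SecretKey dbKey = gen.generateKey(); // the database encryption key

        byte[] wrapped = wrapKey(rsa.getPublic(), dbKey);     // stored alongside the database
        Key recovered = unwrapKey(rsa.getPrivate(), wrapped); // performed once at startup
        System.out.println(Arrays.equals(recovered.getEncoded(), dbKey.getEncoded())); // true
    }
}
```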

Cell Level Encryption (CLE)
The implementation of CLE by Microsoft SQL Server is quite different from that of TDE. The EKM Provider software is still responsible for managing the symmetric encryption key, but it is accomplished in a different way. You must make small changes to your application SQL statements to request encryption and decryption of the cell contents. When CLE is activated the Key Connection for SQL Server software is called for each column and row that needs to be encrypted or decrypted. This means a lot more calls to the EKM Provider software, and this is where key caching is very important.

The Key Connection for SQL Server software in this case does cache the symmetric encryption key (usually a 256-bit AES key) in order to improve performance. The cached key is protected with an equally strong RSA key to prevent key capture by malware. When SQL Server calls the Townsend Security EKM provider, the software retrieves the key from the key server and caches it locally for a 24-hour period. For the next 24 hours all subsequent requests for encryption or decryption are satisfied locally without the need to retrieve the key again. After 24 hours, the key is discarded and a fresh key is retrieved from the key server. If the connection to the key server is not available, error messages are written to the Windows Event Log, but encryption processes will continue using the locally cached key. Once the 24-hour period expires, network connectivity will need to be restored so that a fresh key can be retrieved and operations resume. With key caching, database encryption performance is much better.

The architecture of the Alliance Key Manager EKM provider implements other core features needed to help protect your database. These include:

  • Separation of Duties between Key Administrators and Database Administrators
  • Dual Control for key management operations
  • Built-in logging to the Windows Event Manager
  • High availability failover to one or more secondary key servers
  • Automatic recovery of failed EKM Provider services
  • Security of credentials through Windows Certificate Store
  • Easy key rollover using native SQL Server commands

Key caching is important for performance, but this is just one part of an overall key management strategy for Microsoft SQL Server.

As customers move to virtualized and cloud environments, Alliance Key Manager and the Key Connection for SQL Server EKM Provider software will move with you. In addition to traditional IT data centers, all Townsend Security encryption and key management solutions run in VMware (vSphere, ESXi, etc.), Microsoft Azure, Amazon Web Services, and in any cloud service provider vCloud environment.

Encryption and Key Management for Microsoft SQL Server

Tags: Alliance Key Manager, SQL Server
