Townsend Security Data Privacy Blog


Recent Posts

Townsend Security Announces Major Update to Alliance LogAgent for IBM i

Posted by Luke Probasco on Nov 29, 2016 12:01:00 AM

New features include full reporting of administrative users, including authority the user adopts through Group Profiles and Supplemental Group Profiles.

Townsend Security today announced a significant update to its Alliance LogAgent for IBM i (AS/400, iSeries) solution, enabling full reporting of administrative users, including authority the user adopts through Group Profiles and Supplemental Group Profiles. Alliance LogAgent is a Security Information and Event Management (SIEM) integration solution that collects, formats, and transmits security information in real time to any SIEM or log collection server.

When the new configuration options are enabled, Alliance LogAgent will tag all significant security events performed by administrative-level users. This enhancement will help security administrators easily identify which users have elevated privileges, and enable SIEM solutions to quickly identify and alert on their operations. In addition to the new administrative user reporting, Alliance LogAgent now provides an easy-to-use local assessment report that identifies privileged users. This will reduce the overhead of inspecting and adjusting the privileges of IBM i users.

Alliance LogAgent is compatible with all SIEM solutions that accept syslog messages, IBM QRadar Log Event Extended Format (LEEF), or HP ArcSight Common Event Format (CEF). The new administrative field reporting will make it easy for SIEM administrators to create dashboards, compliance reports, and alerts based on the reported fields. When administrator privileges are detected, Alliance LogAgent adds the following field to the security message:

            admin_user=yes

For IBM QRadar the new field is:

            adminUser=yes

By providing a normalized field in the security events sent to the SIEM monitoring platform, the SIEM’s query and forensic tools can be used more effectively.
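To make this concrete, here is a hypothetical example of how the normalized field might appear among the other key-value pairs of a syslog-format event (the surrounding fields are illustrative only, not the product's exact output format):

            Apr 12 09:14:03 IBMI01 QAUDJRN: event=PW user=JSMITH admin_user=yes

A SIEM query can then filter or alert on admin_user=yes without any knowledge of IBM i user profile internals.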

“Many IBM i customers have struggled with identifying who on their system has elevated privileges. It is crucial to identify and strictly control these users as cyber criminals often use privilege escalation to enable the exfiltration of sensitive data,” said Patrick Townsend, CEO of Townsend Security. “On first look an IBM i account may appear to have normal user privileges, but may in fact inherit higher privileges through a Group Profile or Supplemental Group Profile. Alliance LogAgent now detects these elevated privileges in real time, and provides the security administrator with an easy-to-use report to identify the source of elevated privileges. We think this is a crucial enhancement that will help IBM i customers better secure their platforms.”

Alliance LogAgent is in use with a wide variety of SIEM solutions including LogRhythm, SecureWorks, NTT Solutionary, IBM QRadar, Alert Logic, AlienVault, McAfee SIEM, Splunk, SolarWinds, and many others. In addition to collecting the IBM i security audit journal information, Alliance LogAgent collects system history messages, operator messages, exit point information, system statistics, and a variety of open source application logs in Unix/Linux format.

The solution is licensed on a per logical partition (LPAR) basis, with perpetual and subscription licensing options available. Existing Alliance LogAgent customers on a current maintenance contract can upgrade to the new version at no charge.


Topics: Alliance LogAgent, Press Release

IBM i FieldProc Architecture and Implementation

Posted by Luke Probasco on Oct 19, 2016 10:33:47 AM

Excerpt from the eBook "IBM i Encryption with FieldProc - Protecting Data at Rest."


FieldProc Architecture

FieldProc is a type of column-level exit point that is implemented directly in the DB2 database. As with any of the other IBM exit points, IBM provides the architecture for the exit point to invoke a user application, but IBM does not provide that application. Customers or vendors can create a FieldProc application based on the documented architecture of the exit point. Townsend Security is one vendor that provides such software.

A note on terminology: We will use the term “column” when referring to a field in a file, and the term “table” to refer to a physical file. For the purposes of discussing FieldProc these terms are interchangeable, and we will use the more modern terms “column” and “table.” Similarly, we will use the term “index” to refer to a column that is either a primary or secondary key. A secondary key would include a key defined in a logical file.

The exit point architecture is very simple. There are only two commands and three functions that are supported. The two commands are:

  • Start FieldProc
  • End FieldProc

The three functions that are handled by a FieldProc program are:

  • Initialization
  • Encode (Encrypt)
  • Decode (Decrypt)

When FieldProc is started on a column the FieldProc program is called for Initialization, and then called for each row in the table to provide for the encryption of the column. When FieldProc is ended the FieldProc program is called for each row to decrypt the data. All other normal read, change, and insert operations call the FieldProc program to provide for encryption or decryption as needed. FieldProc is not invoked for a delete operation on a row.

FieldProc Implementation

FieldProc is a SQL feature and is implemented through SQL commands. That is, FieldProc is not implemented through DDS, but only through SQL. Through SQL you create or alter a table to have the FIELDPROC attribute like this:

ALTER TABLE employee
  ALTER COLUMN salary
  SET FIELDPROC Userpgm

Likewise, you can remove the FieldProc attribute from a column using the DROP FIELDPROC clause of an ALTER TABLE statement like this:

ALTER TABLE employee
  ALTER COLUMN salary
  DROP FIELDPROC

As noted above, when you start FieldProc with the CREATE TABLE or ALTER TABLE commands the effect is immediate. Starting FieldProc causes the FieldProc program to be called for each row in the table to perform encryption. Dropping the FieldProc attribute of a column causes the FieldProc program to be called for each row to decrypt the data.
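If you want to verify which columns currently have a field procedure attached, DB2 for i records this in the SQL catalog. A minimal sketch, assuming your release of IBM i provides the QSYS2.SYSFIELDS catalog view (which contains a row for each column that has a field procedure):

-- List FieldProc-enabled columns for a table
SELECT *
  FROM QSYS2.SYSFIELDS
 WHERE TABLE_NAME = 'EMPLOYEE'

This kind of check is useful after a restore or a change management promotion to confirm that FieldProc is still active where you expect it.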

Certain operations will cause FieldProc to be invoked even if the column data is not being used. For example, a legacy RPG program might read a file from beginning to end but not use a particular column that is under FieldProc control. Even though the column is not used in the RPG program FieldProc will be invoked to decrypt the value of the column for each row.

Note that a native SQL SELECT statement that does not include the encrypted column will NOT invoke FieldProc for decryption. Some work management operations are difficult with the current IBM implementation of FieldProc. For example, save and restore operations can be problematic when the restore library is different from the save library. Additionally, change management operations that change column attributes can be difficult to manage, and require decrypting the database, applying the change, and then re-encrypting the database. These limitations are especially true in legacy RPG applications.

Supported Field (Column) Types

Most column types are supported for FieldProc, including character, numeric, date, time, and timestamp fields. You can use FieldProc on null-capable fields and double-byte character fields. Some types of derived and counter fields do not support FieldProc, but IBM i customers will find that all normal application fields are supported. With FieldProc there is no need to change the field size or attribute, nor is there any need to change or re-compile applications.

Legacy DDS Files

Many IBM i customers have the impression that legacy DDS files will not support FieldProc encryption. This is not true! You can readily implement FieldProc encryption on files created with DDS, and it is not necessary to convert them to DDL and SQL tables. As IBM notes there are some advantages to doing so, but it is not necessary, and the large majority of FieldProc users continue to use DDS files. As noted above, you can only start and stop FieldProc using SQL commands, but these SQL commands work fine on DDS-created files.
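As an illustration, the statement is the same for a DDS-created physical file; you simply address the file by name from SQL (the library, file, and field names here are hypothetical):

-- FieldProc on a DDS-created physical file
ALTER TABLE mylib.custpf
  ALTER COLUMN custname
  SET FIELDPROC Userpgm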

Encrypting Multiple Fields in One File

DB2 FieldProc control can be started on multiple columns in one database table. There is no practical limit on the number of columns you can select for FieldProc implementation. Of course, there are performance implications to encrypting multiple columns, as the FieldProc program will be called independently for each column under FieldProc control. But many IBM i customers place multiple columns in a table under FieldProc encryption control; in some cases customers place hundreds of columns under FieldProc control. FieldProc's support for multiple columns includes columns that are primary or secondary keys to the data. Please see the following sections for a discussion of limitations related to encrypted indexes and legacy RPG applications.
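As a sketch with hypothetical names, starting FieldProc on several columns is just a series of ALTER COLUMN operations, each naming the FieldProc program:

ALTER TABLE patient
  ALTER COLUMN ssn SET FIELDPROC Userpgm

ALTER TABLE patient
  ALTER COLUMN email SET FIELDPROC Userpgm

Each statement triggers the immediate row-by-row encryption of its column, and thereafter each column is encoded and decoded through its own independent calls to the FieldProc program.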


Topics: IBM i, FIELDPROC

Encrypted Indexes - Overcoming Limitations with FieldProc Encryption on the IBM i

Posted by Luke Probasco on Aug 30, 2016 11:00:00 AM

Excerpt from the eBook "IBM i Encryption with FieldProc - Protecting Data at Rest."


Encrypted Indexes

The DB2 FieldProc implementation fully supports the encryption of columns which are indexes (keys) to the data in a native SQL context; this includes RPG applications that use embedded SQL. However, there are some severe limitations with legacy RPG applications around encrypted index order. If you are like most IBM i users, it is important to understand these limitations before implementing DB2 encryption in RPG.


Legacy RPG applications use record-oriented operations, not the set-oriented operations that are typical of SQL. Many record-oriented operations on encrypted indexes in RPG will work as expected. For example, a CHAIN operation on an encrypted index to retrieve a record from a DB2 table will work: if the record exists it will be retrieved, and FieldProc will decrypt the value. However, many range and data-ordering operations will not work as expected with legacy RPG programs, leaving you with empty reports and subfiles. Consider the following logic:

SETLL (position to a particular location in the index for a file)
WHILE (some condition)
    READ (read the next record by the index)
END WHILE

The SETLL (Set Lower Limit) operation will probably work if the particular index value exists. However, the program logic will then read the next records based on that position in the file. IBM i developers are surprised to learn that they will be reading the next record based on the ENCRYPTED value, and not on the decrypted value, which is what they might expect. The result is often empty subfiles and printed reports. This is very common logic in applications where indexes are encrypted. Note that your RPG program will not get a file I/O error; it just won’t produce the results you expect.

In simpler applications this side effect of encrypted indexes is not significant, or is easily programmed around. However, in applications where sensitive data is encrypted across a large number of tables (think social security numbers in banking applications), this can be a significant limitation.
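For comparison, the same range-and-order logic expressed as set-oriented SQL behaves as expected, because the SQL Query Engine invokes FieldProc to decode the column values before applying the comparison and the ordering. A sketch with hypothetical table and column names:

-- Range read on an encrypted index, native SQL context
SELECT name, ssn
  FROM employee
 WHERE ssn >= '500000000'
 ORDER BY ssn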

Don’t despair, keep reading.

Overcoming Encrypted Index Limitations in RPG

The limitations of encrypted indexes in legacy RPG applications often lead IBM i customers to abandon their encryption projects. The prospect of converting a large number of legacy RPG applications to newer SQL interfaces can be daunting. Their legacy RPG applications contain a lot of valuable business logic, and the effort to make the conversion can be quite large.

Wouldn’t it be great if you could wave a magic wand and make legacy RPG applications use SQL without any changes?

IBM opened a path to this type of solution with Open Access for RPG, or OAR. OAR allows for the substitution of traditional file I/O operations with user-written “handlers.” In essence, you can replace the native file I/O operations of RPG with your own code, with no change to program file handling or business logic. The OAR capability spawned a number of user interface modernization products, and other solutions that take advantage of it. Imagine if your RPG screen-handling I/O with Execute Format (EXFMT) could be replaced with a web-based GUI library: instant modernization of the UI. There are now many solutions that leverage OAR for UI modernization.

Join Logical Files with DDS

One limitation of logical files created with DDS specifications involves join logical files. You will not be able to create a DDS join logical file where the join involves a field encrypted with FieldProc; you will get an error about invalid data type usage. This is an IBM limitation and there is no known workaround for this issue. Note that this limitation only applies to DDS join logical files and does not apply to SQL joins using encrypted indexes. Most IBM i customers will need to change their RPG or COBOL program logic to avoid the use of DDS join logical files which use encrypted indexes.
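The equivalent SQL join on an encrypted index does work. A minimal sketch with hypothetical tables, where cust_ssn and ssn are columns under FieldProc control:

-- SQL join on FieldProc-encrypted columns (works where a DDS join logical file cannot)
SELECT o.order_id, c.name
  FROM orders o
  JOIN customers c
    ON o.cust_ssn = c.ssn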

Application Changes

Legacy RPG applications that use encrypted indexes often need re-design and re-programming to avoid the problems of encrypted indexes. You can avoid these issues if you are using an Open Access for RPG (OAR) solution that maps the legacy RPG record-based file operations to native SQL and the SQL Query Engine (See the note above about Townsend Security’s OAR/SQL offering).

If you need to re-design your RPG applications to avoid encrypted indexes, consider putting all of your sensitive data in a table that is indexed by some non-sensitive value, such as a sequence number or customer number, and use FieldProc to encrypt that table, as sketched below. There are many approaches to application re-design, and you should be able to find a method that works for you.
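A minimal sketch of that redesign, with hypothetical names. The table is keyed by a non-sensitive surrogate, and only the sensitive column is placed under FieldProc control:

-- Non-sensitive primary key; sensitive column encrypted
CREATE TABLE patient_private (
  patient_id INTEGER NOT NULL PRIMARY KEY,
  ssn CHAR(9) NOT NULL
)

ALTER TABLE patient_private
  ALTER COLUMN ssn
  SET FIELDPROC Userpgm

Programs then position and read by patient_id, so no encrypted column is ever used for record positioning.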

Change Management

IBM i customers who have deployed change management solutions can encounter challenges with FieldProc implementations. While most of these challenges are surmountable, it will take effort on the part of the systems management team to integrate FieldProc controls into their change management strategy. Unfortunately the implementation of FieldProc does not provide much in the way of support for change management systems. The activation of FieldProc changes an attribute on the column in the table, but change management systems generally are not aware of this attribute. It can be challenging to promote table and column level changes and properly retain the FieldProc attribute of the data. Look for a FieldProc implementation that provides a command-level interface to stop, start, and change FieldProc definitions. These commands can help you integrate FieldProc encryption into your change management strategy.


Topics: IBM i, FIELDPROC

IBM i, PCI DSS 3.2, and Multi-Factor Authentication

Posted by Luke Probasco on Aug 16, 2016 10:48:05 AM

With the recent update to the Payment Card Industry Data Security Standard (PCI DSS) regarding multi-factor authentication (also known as Two Factor Authentication or 2FA), IBM i administrators are finding themselves faced with the requirement of deploying an authentication solution within their cardholder data environments (CDE). Prior to version 3.2 of PCI DSS, remote users were required to use two factor authentication for access to all systems processing, transmitting, or storing credit card data. With version 3.2 this is now extended to include ALL local users performing administrative functions in the CDE.

Here is an excerpt from section 8.3: (emphasis added)

8.3 Secure all individual non-console administrative access and all remote access to the cardholder data environment (CDE) using multi-factor authentication.

I recently sat down with Patrick Townsend, Founder & CEO of Townsend Security, to talk with him about PCI DSS 3.2, what it means for IBM i users, and what IBM i users can do to meet the latest version of PCI DSS.

Thanks for taking some time to sit down with me, Patrick. Can you recap the new PCI DSS version 3.2 multi-factor authentication requirement? This new requirement seems to be generating a lot of concern.

Well, I think the biggest change in PCI DSS 3.2 is the requirement for multi-factor authentication for all administrators in the cardholder data environment (CDE).  Prior to 3.2, remote users, like contractors and third-party administrators, had to use multi-factor authentication to log in to the network.  This update extends the requirement of multi-factor authentication to ALL local, non-console users.  We are seeing businesses deploy multi-factor authentication at the following levels:

  •      Network Level – When you first access the network
  •      System Level – When you access a server or any system within the CDE
  •      Application Level – Within your payment application

The requirement for expanded multi-factor authentication is a big change and is going to be disruptive for many merchants and processors to implement.

Yeah, sounds like this is going to be a big challenge.  What does this mean for your IBM i customers?

There are definitely some aspects of this PCI DSS update that will be a bigger challenge on the IBM i compared to Windows or Linux.  First, we tend to run more applications on the IBM i.  In a Windows or Linux environment you might have one application per server.  On the IBM i platform, it is not uncommon to run dozens of applications.  What this means is that you have more users with administrative privileges to authenticate – on average there can be 60 or more, and they can sometimes be a challenge to identify!  When merchants and processors look at their IBM i platforms, they will be surprised at the number of administrators they discover.

Additionally, the IBM i typically has many network services exposed (FTP, ODBC, Operations Navigator, etc.).  The challenge of identifying all the entry points is greater for an IBM i customer.

You say it is harder to identify an administrative user. Why is that?

On the IBM i platform, there are some really easy and some really difficult administrative users to identify.  For example, it is really easy to find users with QSECOFR (similar to a Windows Administrator or Linux Root User) privileges.  But it starts to get a little more difficult when you need to identify users, for example, who have all object (*ALLOBJ) authority.  These users have almost the equivalent authority of QSECOFR.  Finding, identifying, and properly inventorying users as administrators can be a challenge.

Additionally, with a user profile, there is the notion of a group profile.  It is possible for a standard user, who may not be an administrator, to have an administrative group profile.  To make it even more complex, there are supplemental groups through which a user can also adopt elevated authority.  Just pause for a minute and think of the complex nature of user profiles and how people implement them on the IBM i platform.  And don’t forget, you may have users on your system who are not highly privileged directly through their user profile, but may be performing administrative tasks related to the CDE.  Identifying everyone with administrative access is a big challenge.
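To give a sense of what that discovery work looks like, a single catalog query can surface directly privileged profiles. This is a sketch, assuming your IBM i release provides the QSYS2.USER_INFO catalog view; note that it still misses authority adopted through group and supplemental group profiles, which is exactly the gap Patrick describes:

-- Profiles holding *ALLOBJ or *SECADM special authority directly
SELECT AUTHORIZATION_NAME, SPECIAL_AUTHORITIES, GROUP_PROFILE_NAME
  FROM QSYS2.USER_INFO
 WHERE SPECIAL_AUTHORITIES LIKE '%*ALLOBJ%'
    OR SPECIAL_AUTHORITIES LIKE '%*SECADM%'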

Townsend Security has a multi-factor authentication solution for the IBM i.  How are you helping customers deal with identifying administrators?

From the beginning, we realized this would be a problem and we have taken some additional steps, specifically related to PCI DSS 3.2 to help IBM i customers identify administrators.  We made it possible to build a list of all users who have administrative access and then require them to use multi-factor authentication when logging on.  We have done a lot to help the IBM i security administrator identify highly privileged users and enroll them into a two factor authentication solution, and then periodically monitor/update/audit the list.

What are some of the other multi-factor authentication challenges that IBM i customers face?

Some of them are pretty obvious.  If you don’t have a multi-factor authentication solution in place, there is the effort of evaluating and deploying something on the IBM i server.  You’ll find users who may already have a multi-factor authentication solution in place for their Windows or Linux environments, but haven’t extended it to their IBM i.  Even if they aren’t processing credit card numbers on the IBM i, if it is in the CDE, it still falls within the scope of PCI DSS.

Aside from deploying a solution, there is going to be administrative work involved.  For example, managing the new software, developing new procedures, and putting governance around multi-factor authentication.  Further, if you adopt a hardware-based solution with key FOBs, you have to have processes in place to distribute and replace them, as well as manage the back-end hardware.  It has been really great seeing organizations move to mobile-based authentication solutions based on SMS text messages, where there isn’t any hardware or FOBs to manage.  Townsend Security’s Alliance Two Factor Authentication went that route.

Let’s get back to PCI DSS.  As they have done in the past, they announced the updated requirements, but businesses still have a period of time to get compliant.  When does PCI DSS 3.2 actually go into effect?

The PCI SSC always gives merchants and processors time to implement new requirements.  The actual deadline to meet compliance is February 1, 2018.  However, what we are finding is that most merchants are moving rapidly to adopt the requirements now.  When an organization has an upcoming audit or Self Assessment Questionnaire (SAQ) scheduled, they generally will want to meet the compliance requirements for PCI DSS 3.2.  It never is a good idea to wait until the last minute to try and deploy new technology in order to meet compliance.

You mentioned earlier that you have a multi-factor authentication solution.  Tell me a little bit about it.

Sure. Alliance Two Factor Authentication is a mature, cost-effective solution that delivers PINs to your mobile device (or voice phone), rather than through an expensive key FOB system. IBM i customers can significantly improve the security of their IBM i systems through implementation of proven two factor authentication.  Our solution is based on a non-hardware, non-disruptive approach.  Additionally, we audit every successful or failed authentication attempt and log it in the security audit journal (QAUDJRN).  One thing that IBM i customers might also be interested in: in some cases, we can even extend their existing multi-factor authentication solution to the IBM i with Alliance Two Factor Authentication.  Then they can benefit from the auditing and services that we supply for the IBM i platform.  Our goal was to create a solution that is very cost-effective, rapid to deploy, meets compliance regulations, and doesn’t tip over the IT budget.

Download a podcast of my complete conversation here and learn more about what PCI DSS 3.2 means for IBM i administrators, how to identify administrative users, challenges IBM i customers are facing, and how Townsend Security is helping organizations meet PCI DSS 3.2.

Podcast: IBM i, PCI DSS, and Multi-Factor Authentication

 

Topics: 2FA, IBM i, PCI DSS

Encryption & Key Management for the IBM i

Posted by Luke Probasco on Jul 25, 2016 10:38:04 AM

Excerpt from the eBook "IBM i Encryption with FieldProc - Protecting Data at Rest."


Encryption in FieldProc

It goes without saying that your FieldProc application will need to use an encryption library to perform encryption and decryption operations. IBM provides an encryption software library as a native part of the IBM i operating system. It is available to any customer or vendor who needs to implement encryption and decryption in their FieldProc programs.

Unfortunately the native IBM encryption library is very slow. This might not be noticeable when encrypting or decrypting a small amount of data, but batch operations can be negatively impacted. The advent of AES encryption on the POWER8 processor has done little to mitigate the performance issue with encryption. IBM i customers and third-party vendors of FieldProc solutions should use caution when implementing FieldProc using the native IBM i AES software libraries. They are undoubtedly accurate implementations of AES encryption, but suffer on the performance front.

Key Management

An encryption strategy is only as good as the key management strategy, and it is difficult to get key management right. For companies doing encryption, the most common cause of an audit failure is an improper implementation of key management. Here are a few core concepts that govern a good key management strategy:

  • Encryption keys are not stored on the same system as the sensitive data they protect.
  • Security administrators of the key management solution should have no access to the sensitive data, and database administrators should have no access to encryption key management (Separation of Duties).
  • On the IBM i system this means that security administrators such as QSECOFR and any user with All Object (*ALLOBJ) authority should not have access to data encryption keys or key encryption keys.
  • More than one security administrator should authenticate before accessing and managing keys (Dual Control).
  • All access to encryption keys should be logged and audited. This includes use of encryption keys as well as management of keys.
  • Encryption keys should be mirrored/backed up in real time to match the organization’s standards for system availability.

Encryption Key Caching

Encryption keys are often used frequently when batch operations are performed on sensitive data. It is not unusual that a batch program would need to perform millions or tens of millions of encryption and decryption operations. While the retrieval of an encryption key from the key server may be very efficient, performance may suffer when keys need to be retrieved many times. This can be addressed through encryption key caching in the local environment.

Key caching should be performed in separate program modules, such as a service program; keys should not be cached in user programs, where they are more subject to discovery and loss. Any module caching an encryption key should have debugging options disabled and visibility removed. Secure key caching is critical for system performance, and care should be taken to protect key storage.

Encryption Key Rotation

Periodically changing the encryption keys (sometimes called “key rotation” or “key rollover”) is important to the overall security of your protected data. Both data encryption keys (DEK) and key encryption keys (KEK) should be changed at appropriate intervals. The appropriate interval for changing keys depends on a number of variables, including the amount of data the key protects and the sensitivity of that data, as well as other factors. This interval is called the cryptoperiod of the key and is defined by NIST in Special Publication 800-57, “Recommendation for Key Management.” For most IBM i customers, rotation of data encryption keys should occur once a year, and rotation of the key encryption keys should occur no less than once every two years.


Topics: Encryption, IBM i, FIELDPROC

Encryption & Key Management for SQL Server

Posted by Luke Probasco on Jul 22, 2016 3:27:11 PM

Excerpt from the eBook "Encryption & Key Management for Microsoft SQL Server."


Microsoft SQL Server has become a ubiquitous storage mechanism for all types of digital assets. Protecting these data assets in SQL Server is a top priority for business executives, security specialists, and IT professionals. The loss of sensitive data can be devastating to the organization, and in some cases represents a catastrophic loss. There is no alternative to a digital existence, and cybercriminals, political activists, and state actors have become more and more adept at stealing this information. To properly protect this information, businesses are turning to encryption and key management.

Encryption

Encryption in the broadest sense means obscuring information to make it inaccessible to unauthorized access. But here we will use the term in its more precise and common sense: the use of well-accepted encryption algorithms that are based on mathematical proofs and have been embodied and approved as international standards.

Many approaches to encryption do not meet minimal requirements for security and compliance. Our definition of encryption excludes:

  • Homegrown methods developed by even experienced and talented programmers.
  • Emerging encryption methods that are not yet widely accepted.
  • Encryption methods that are widely accepted as secure, but which have not been adopted by standards organizations.
  • Data substitution and masking methods not based on encryption.

Examples of encryption methods that do meet our criteria include the Advanced Encryption Standard (AES), sometimes known as Rijndael; Triple Data Encryption Standard (3DES); RSA; and Elliptic Curve encryption methods.

In the context of protecting data in a SQL Server database, the most common encryption method for protecting whole databases or an individual column in a table is AES. All key sizes of AES (128-bit, 192-bit, and 256-bit) are considered secure and are appropriate for protecting digital assets. Many organizations choose 256-bit AES for this purpose due to the larger key size and stronger security.

One major additional benefit of using an industry standard such as AES is that it meets many compliance requirements or recommendations for the use of industry standard encryption. This includes the PCI Data Security Standard (PCI-DSS), HIPAA, FFIEC, and the EU General Data Protection Regulation (EU GDPR).
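As a concrete illustration of standards-based encryption in SQL Server, Transparent Data Encryption (TDE) applies AES at the whole-database level. A minimal sketch with hypothetical database and certificate names; note that a production deployment should protect the key hierarchy with an external key manager rather than leaving it on the database server (see the key management discussion below):

-- Whole-database AES encryption with TDE (names hypothetical)
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'a strong password here';
CREATE CERTIFICATE TdeCert WITH SUBJECT = 'TDE certificate';

USE Sales;
CREATE DATABASE ENCRYPTION KEY
  WITH ALGORITHM = AES_256
  ENCRYPTION BY SERVER CERTIFICATE TdeCert;
ALTER DATABASE Sales SET ENCRYPTION ON;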

Key Management

It is not possible to discuss an encryption strategy without discussing the protection of encryption keys. An encryption strategy is only as good as the method used to protect the encryption keys. Encryption algorithms such as AES and Triple DES are public and readily available to any attacker. The protection of the encryption key is the core to the security of the encrypted data. This is why security professionals consider the loss of the encryption key as equivalent to the loss of the digital assets. Once an attacker has the encryption key it is trivial to decrypt and steal the data.

Generating strong encryption keys and protecting them is harder than it might at first appear. The generation of strong encryption keys depends on the use of random number generation schemes, and modern computers do not excel at doing things randomly. Specialized software routines are needed to generate strong encryption keys. Encryption keys must also be securely stored away from the data they protect, and yet must be readily available to users and applications that are authorized to access the sensitive data. Authenticating that a user or application is authorized to use an encryption key is a large focus of key management systems.

Over the years standards and best practices have emerged for encryption key management and these have been embodied in specialized security applications called Key Management Systems (KMS), or Enterprise Key Management (EKM) systems. The National Institute of Standards and Technology (NIST) has taken a lead in this area with the creation of Special Publication 800-57 entitled “Recommendation for Key Management”. In addition to this important NIST guidance, the organization publishes the Federal Information Processing Standard (FIPS) 140-2 “Security Requirements for Cryptographic Modules”. To serve the needs of organizations needing independent certification that a key management application meets this standard, NIST provides a validation program for FIPS 140-2 compliant systems. All professional key management systems have been validated to FIPS 140-2.

When protecting sensitive SQL Server data with encryption, look for these core principles of key management:

  • Encryption keys are stored away from the data they protect, usually on specially designed security devices or dedicated virtual servers.
  • Encryption keys are managed by individuals who do not have access to the data stored in the SQL Server database (Separation of Duties).
  • Encryption key management requires more than one security administrator to authenticate before performing any critical work on keys (Dual control).
  • Key retrieval requests from users and applications are authenticated using industry standard methods.
  • Encryption management and key usage are logged in real time and logs are stored on secure log collection servers.
  • Encryption key management systems have been validated to FIPS 140-2 and implement the Key Management Interoperability Protocol (KMIP).

These are just a few of the core requirements for deploying a professional key management solution to protect your SQL Server data.
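One concrete pattern that ties several of these principles together is SQL Server's Extensible Key Management (EKM) interface, which lets SQL Server use keys held on an external key manager through a vendor-supplied provider. A minimal sketch; the provider DLL path and key name are hypothetical and come from your key management vendor:

-- Register the vendor's EKM provider and open an externally managed key
CREATE CRYPTOGRAPHIC PROVIDER KeyMgrProvider
  FROM FILE = 'C:\kms\ekm_provider.dll';

CREATE SYMMETRIC KEY sales_key
  FROM PROVIDER KeyMgrProvider
  WITH PROVIDER_KEY_NAME = 'sales_key',
       CREATION_DISPOSITION = OPEN_EXISTING;

Because the key material stays on the key manager, this approach supports the separation of duties and off-server key storage described above.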


Topics: Encryption, SQL Server

Encryption Performance on the IBM i

Posted by Luke Probasco on Jul 12, 2016 11:11:00 AM

Excerpt from the eBook "IBM i Encryption with FieldProc - Protecting Data at Rest."


The performance of an encryption solution is one of the biggest concerns an IBM i customer has when implementing FieldProc. Many factors can affect the performance of a FieldProc application, and it is wise to pay special attention to them as you prepare to implement a solution. Let’s look at several of these factors.

AES 128-bit and AES 256-bit Performance

All key sizes for AES encryption (128-bit, 192-bit, and 256-bit) are considered secure for protecting all commercial sensitive data. Customers naturally wonder if the smaller key size of 128-bit encryption means better encryption performance when compared to 256-bit AES keys. In fact, the performance difference is much smaller than one might imagine. The smaller 128-bit key uses 10 rounds during AES encryption, while a 256-bit AES key uses 14 rounds. The IBM i RISC processor optimizes cryptographic operations, so the performance penalty for 256-bit AES encryption is very small. Considering that cybersecurity guidelines now recommend the use of 256-bit AES for protecting data in long-term storage, and for protecting against advances in quantum computing, you should always consider using 256-bit AES encryption for protecting IBM i sensitive data.

AES Encryption Software Performance

Encryption software libraries can vary greatly in performance, even when using exactly the same methods and key sizes. Unfortunately the native IBM i AES encryption software library and APIs have a low performance profile, and this can have a negative impact on your IBM i applications. Fortunately, there are high-performance AES encryption libraries available from third parties that have a much better performance profile. The difference between the native IBM encryption libraries and third-party libraries can be more than 100 times in processing speed. Use care when deploying an encryption solution to ensure that the performance meets your minimum needs for processing time.

POWER8 On-Chip AES Encryption Performance

The new POWER8 processor from IBM provides a hardware implementation of AES encryption to improve the performance of encryption. All new models of IBM i servers built with the POWER8 processor include this hardware implementation of AES encryption. In concept this is similar to the Intel processors, which provide AES encryption through the AES-NI instruction set.

While encryption performance has improved with the POWER8 processor, it is not a large improvement. Encryption speeds using the native IBM i APIs are about double those of the previous versions of the Power processor, but still greatly lag the performance of third-party encryption libraries. It should be noted that this performance lag disappears when the IBM APIs are used to encrypt very large blocks of information. For example, the AES implementation on the POWER8 processor can encrypt a megabyte of information at incredible speeds. Unfortunately this does not apply to the smaller blocks of data (credit card numbers, social security numbers, etc.) that are typically stored in DB2 database tables. It is likely that IBM i customers using iASP encryption will see a major performance benefit with the new POWER8 systems.

FieldProc Program Performance

When FieldProc invokes a program to perform encryption or decryption it makes a dynamic call to a program executable. There is always a certain amount of overhead when making a dynamic call to an external program, and this is true in a FieldProc context, too. The more columns in a table that are under FieldProc control, the more dynamic calls you will have to the FieldProc program, as the FieldProc program is invoked independently for each column. It is important that the FieldProc program be properly optimized for performance. Some key performance measures include:

  • Minimized file I/O operations by the FieldProc program.
  • Program modules are optimized on compilation.
  • The program executable is optimized on compilation.
  • Memory management is optimized for performance.
  • Visibility of program objects is removed to improve performance.
  • The FieldProc program is optimized for multi-threaded operation.
  • Audit logging is optimized for performance. 

It is important that the FieldProc application be properly optimized as it may be invoked many times during a typical interactive or batch request.

FieldProc for Multiple Columns in a Table

It is likely that you will need to protect multiple columns in a table. For example, in a medical setting a table might contain the following information for a patient:

  • Name
  • Date of birth
  • Address
  • Social security number
  • Email address
  • Etc.

In this case you will be protecting multiple columns in a single table. This is fully supported by FieldProc and is very common in FieldProc implementations. When FieldProc programs are properly optimized for performance, you should find that the extra performance impact per column is not excessive. Thanks to the re-entrant model of the IBM i operating system, the protection of multiple columns has a smaller performance impact than you might expect: encrypting two columns does NOT have twice the performance impact of encrypting one column.

Key Management Performance Impacts

As mentioned above, your FieldProc encryption solution should implement good encryption key management practices. It is important that the key management interface impose little performance penalty. This means that the FieldProc application should use intelligent and secure key caching to minimize the number of key retrieval operations that must be performed. Additionally, keys should not be exposed in user applications, but should be protected in separate modules. When implemented properly, good key management will impose very little performance impact.

Audit Logging Performance

One common feature of FieldProc applications is the ability to collect audit information about user activity. This might include collecting information about which users accessed decrypted information, and so on. Audit logging will always exact a performance price. If you need to collect audit logs of FieldProc activity, be sure to measure the performance impact of audit log collection in your own environment.


Topics: Encryption, FIELDPROC

Key Management: The Hardest Part of Encryption

Posted by Luke Probasco on Jun 3, 2016 7:19:00 AM

Excerpt from the eBook "2016 Encryption Key Management: Industry Perspectives and Trends." 


While organizations are now committed to implementing encryption, they are still struggling with getting encryption key management right. With all major operating systems, cloud platforms, and virtualization products now supporting encryption, it is relatively easy to make the decision to activate encryption to protect sensitive data. But an encryption strategy is only as good as the method used to protect encryption keys. Most audit failures for customers already using encryption involve the improper storage and protection of encryption keys.

Ignorance and fear are the driving reasons for this core security failure. Many IT professionals are still not versed in best practices for encryption key management, and IT managers fear that the loss of encryption keys or the failure of access to a key manager will render their data unusable. This leads to insecure storage of encryption keys in application code, unprotected files on the data server, and poor protection of locally stored keys.

Most encryption key management solutions have evolved over the last decade to provide unparalleled reliability and redundancy. This has largely removed the risk of key loss in critical business databases and applications. But the concern persists and inhibits the adoption of defensible key management strategies.

Take Aways

  • Protect encryption keys with single-purpose key management security solutions.
  • Never store encryption keys on the same server that houses sensitive data.
  • Only deploy encryption key management solutions that are based on FIPS 140-2 compliant technology.
  • Only deploy encryption key management solutions that implement the KMIP industry standard for interoperability.
  • Avoid cloud service provider key management services where key management and key custody are not fully under your control.

Cloud Migration and Key Management Challenges

Cloud migration continues to be a high priority for organizations large and small. The benefits of migrating to the cloud are clear. Reductions in cost for computing power and storage, leverage of converged infrastructure, reduction of IT administrative costs, on-demand scalability, and many other benefits will continue to drive the rapid migration to cloud platforms. As cloud platforms such as Amazon Web Services (AWS), Microsoft Azure, Google App and Compute Engine, and IBM SoftLayer mature, we can expect the pace of cloud migration to accelerate.

While cloud service providers are providing some encryption key management capabilities, this area will continue to be a challenge. The question of who has control of the encryption keys (key custody) and the shared resources of multi-tenant cloud service providers will continue to be headaches for organizations migrating to the cloud. The ability to exclusively manage encryption keys and deny access to the cloud service provider or any other third-party will be crucial to a good cloud key management strategy and end-customer trust. The attempt by governments and law enforcement agencies to access encrypted data through access to encryption keys will make this issue far more difficult moving forward.

Unfortunately most cloud service providers have not adopted common industry standards for encryption key management. This results in the inability of customers to easily migrate from one cloud platform to another, resulting in cloud service provider lock-in. Given the rapid evolution and relative infancy of cloud computing, customers will have to work hard to avoid this lock-in, especially in the area of encryption key management. This is unlikely to change in the near future.

Take Aways

  • Avoid hardware-only encryption key management solutions prior to cloud migration. Make sure your key management vendor has a clear strategy for cloud migration.
  • Ensure that your encryption key management solution runs natively in cloud, virtual and hardware platforms.
  • Ensure that your encryption key management solution provides you with exclusive management of and access to encryption keys. Neither your cloud service provider nor your encryption key management vendor should have administrative or management access to keys. Backdoor access through common keys or key management code is unacceptable.
  • Avoid cloud service provider lock-in to proprietary key management services. The cloud is still in its infancy and retaining your ability to choose and migrate between cloud platforms is important.

Topics: Encryption, Encryption Key Management

Speaking Words of Wisdom, Let it Key

Posted by Luke Probasco on May 27, 2016 8:57:00 AM

"This article was originally posted on Drop Guard’s blog. Drop Guard automates Drupal updates to ensure security of your site within minutes after a security release."


What Data Needs To Be Encrypted In Drupal?

Encryption has gone mainstream. Thanks to numerous data breaches (781 during 2015 in the U.S. alone), data security is a top priority for businesses of all sizes. Semi-vague language like “we ensure our data is protected” from IT teams is no longer good enough to satisfy the concerns of business executives and their customers. CEOs are losing their jobs, and companies are suffering financial losses and fines that reach into the millions of dollars when poorly encrypted or unencrypted data is lost.

Fortunately, the recent 2016 Global Encryption Trends Study shows that there is a growing shift toward businesses developing an encryption strategy – Germany (61%) and the U.S. (45%) are in the lead – with the primary drivers including:

  • To comply with external or data security regulations
  • To protect enterprise Intellectual Property (IP)
  • To protect information against specific identified threats
  • To protect customer personally identifiable information (PII)

Before getting too much deeper, let’s first clarify a common misperception. Yes, while the big brands are in the headlines, they aren’t the primary target of hackers. A recent threat report from Symantec found that three out of every five cyber-attacks target small and midsize companies. These businesses are targeted because they have weaker security controls and have not taken a defense-in-depth approach to security. Even though some of these businesses may be encrypting their data, chances are they are not doing it correctly.

How do you then define what properly encrypting data involves? First, it means using an industry standard encryption algorithm such as the Advanced Encryption Standard (AES), alternatively known as Rijndael. Standards are important in the encryption world. Standard encryption algorithms receive the full scrutiny of the professional cryptographic community. Further, many compliance regulations such as PCI-DSS, HIPAA/HITECH, FISMA, and others are clear that only encryption based on industry standards meet minimal regulatory requirements. Standards bodies such as NIST, ISO, and ANSI have published standards for a variety of encryption methods including AES.

But wait, my web sites don’t collect credit cards or social security numbers. Why should I care about encryption?

You would be surprised at what can be considered Personally Identifiable Information and should be encrypted. PII now includes information such as (and not limited to):

  • Email address
  • Password
  • Login name
  • Date of birth
  • Address
  • IP address

For a moment, pause and consider how many websites you have built that collect this type of information. As you will quickly realize, even the most basic marketing websites that collect a name and email address should deploy encryption.

Key Management
The second part of properly encrypting data involves key management and understanding the important role that it plays. Your encryption key is all that stands between your data and those that want to read it. So then, what good is a locked door if the key is hidden underneath the “Welcome” mat?

Security best practices and compliance regulations say that encryption keys should be stored and managed in a different location from the encrypted data. This can present a challenge for Drupal developers, as the standard practice has been to store the key on the server outside the web root. While there is a suite of great encryption modules (Encrypt, Field Encryption, Encrypted Files, etc.), they unfortunately all stash the key in similarly insecure locations – in the Drupal database, the settings file, or a protected file on the server. Once a hacker compromises a site, they can then take the encryption keys and have access to all the sensitive data.

“Let it Key”
Let’s now bring this back to one of my favorite songs by the Beatles to paraphrase a line (excuse the pun), “Speaking words of wisdom, let it key.” The Key module provides the ability to manage all keys on a site from a singular admin interface. It can be employed by other modules (like the aforementioned encryption modules) to offload key management so the module is not responsible for protecting the key. It gives site administrators the ability to define how and where keys are stored, which, when paired with the proper key storage modules, allows the option of a high level of security and the ability to meet security best practices and compliance requirements. Drupal developers can now easily integrate enterprise level key management systems such as Townsend Security’s Alliance Key Manager or Cellar Door Media’s Lockr (both of these companies have been heavy sponsors of encryption module development).

Enterprise level key managers do more than just store keys outside of the Drupal installation. In addition to safeguarding encryption keys, they manage them through the entire lifecycle—from creation to destruction. Further, a key manager will allow for:

  • Key naming and versioning
  • Key change and rotation
  • Secure key retrieval
  • Key mirroring
  • Key import and export
  • Password and passphrase protection
  • User and group control for key access 

In addition to providing NIST-compliant AES encryption and FIPS 140-2 compliant key management, an encryption key manager also allows administrators to store their API keys outside of Drupal. Private API keys are frequently used within Drupal by services like Authorize.net, PayPal, MailChimp, etc., and all these modules store their keys in the database in the clear. If your site gets hacked, so does access to the services that you have integrated into your site. For example, if your Amazon S3 API key were in your stolen database, hackers would have access to your entire offsite S3 storage. Consider how detrimental it would be for a client to find out that a hacker has gained access to their PayPal account - after all, they were using PayPal because they didn’t want to deal with the security risks and liability of hosting their own payment processing.

By routing all keys within Drupal to a central key management location, it is now possible to have total control over all encryption keys, API keys, external logins and any other value needing to remain secret. This gives developers and site builders a useful dashboard to quickly see the status of every key, and control where it is stored.

Summary
By now, I have hopefully conveyed the importance of encryption and key management. Everyone from developers to site owners needs to understand the importance of data security and the proper steps they can take to keep their sites, and their businesses, safe. More secure encryption also has an effect on the Drupal platform as a whole. Whether Drupal is being used to create the most basic brochure site or an advanced enterprise web application, encryption and API key management are critical to ongoing success. Many businesses don’t care whether their web site is built with Drupal, WordPress, Joomla, or even Microsoft Word (yes, this is still possible) as long as it is secure. By integrating the proper security controls, including encryption and key management, Drupal will continue to proliferate and be the tool of choice for enterprises looking for the highest level of data and site security.


Topics: Key Management, Drupal

Encryption Key Management & Your IT Strategy

Posted by Luke Probasco on May 24, 2016 7:25:00 AM

Excerpt from the eBook "2016 Encryption Key Management: Industry Perspectives and Trends." 


Virtualization Will Continue to Dominate IT Strategy & Infrastructure

Large and small enterprises will continue to grow their virtualization footprints at the same time that they are looking to migrate applications to the cloud. The cost reductions provided by the market leader VMware will ensure that the VMware customer base will continue to consolidate applications and servers on its virtualization technology, and that VMware will continue to be a powerful player in the IT infrastructure space for many years.

While VMware is the dominant technology provider for virtualization, we will see Microsoft attempt to increase its footprint with Hyper-V, and OpenStack solutions will also expand. We expect that all of the virtualization solution providers will attempt to define a clear path to the cloud based on their technologies. VMware is already moving in this direction with its vCloud Air initiative, and Microsoft uses Hyper-V as the foundation for the Azure cloud.

Encryption key management solutions that only run in hardware, or that only run on cloud platforms, present substantial obstacles for businesses with virtualized data centers. The rich set of management and security tools is not able to work effectively with solutions that sit outside the virtualization boundary. Customers are looking to reduce their hardware footprint, not increase it. And solutions that can’t be managed or secured in the usual way represent additional risk and cost. Encryption key management solutions should be able to run within the virtualization boundary as an approved security application. Key management vendors vary greatly in their ability to support the range of deployments from the traditional IT data center, to virtualized platforms, to the cloud. Organizations will continue to struggle with key management across these environments.

Take Aways

  • Encryption key management solutions should be able to run as fully native virtual machines in a VMware or Hyper-V environment.
  • Encryption key management solutions should be compatible with security and management functions of the virtual platform.
  • To maintain maximum business flexibility, deploy a key management solution that works well in virtual, cloud, and traditional hardware platforms.
  • Look for key management solutions that carry industry security certifications such as PCI Data Security Standard (PCI DSS), etc.

Key Management Vendor Stability Loses Ground

Mergers and acquisitions in the security community continue at a rapid pace. Encryption key management vendors are being absorbed into larger organizations, and this trend will likely continue. The public relations around such mergers and acquisitions are always accompanied by glowing prognostications and happy talk. Unfortunately, as often happens with any merger, key management vendors may experience disruption in their organizations as a result of a merger or acquisition. A key management solution may not be strategically important to an acquirer, and this can result in disinvestment in the solution, negatively impacting customer support. Key management is a part of an organization’s critical infrastructure, and these changes can be disruptive.

Organizations can work to minimize the potential impact of key management vendor consolidation by understanding the vendor’s organizational structure, corporate history, and financial basis. Venture backed organizations can be expected to experience an exit through a merger, acquisition, or public offering. Vendors with solutions that are not strategically important to their product mix can also experience change and disruption. Using care in key management vendor selection may be one of the most important efforts you can make. This will be a continuing challenge in the years ahead.

Take Aways

  • Understand your key management vendor’s equity foundation and the likelihood of a merger or acquisition. If the key management vendor is largely funded by venture capital it is almost certain that the company will experience a merger or acquisition event.
  • Understand your key management vendor’s management team. Have key employees been with the company for a longer period of time? This is one good indicator of organizational stability.

Vendor Customer Support is a Growing Concern

As mentioned previously, encryption key management vendors continue to be absorbed into larger organizations, and this trend will likely continue. Unfortunately, as can happen with any merger, key management vendors may experience disruption in their organizations as a result of a merger or acquisition. This can directly affect the customer support organization and your ability to get timely and reliable technical support for your encryption key management solution. Deteriorating customer support can put your organization at risk. Key management solutions are a part of your critical infrastructure, and proper customer support is crucial to operational resilience.

Another side effect of reduced or under-funded customer support is the inability of your organization to expand and invest in new applications and systems. These impacts on customer support may not present short-term problems, but can impair long-term resilience and growth flexibility. Many organizations will continue to experience inadequate customer support from key management vendors.

Take Aways

  • Understand the customer support organization of your key management vendor. Does the vendor demonstrate a strong investment in customer support? Is there adequate management of the customer support team? 
  • Review the Service Level Agreement (SLA) provided by your key management vendor. Be sure you understand the expected response times provided by the vendor customer support team. 
  • How do other organizations experience customer support from your key management vendor? Be sure to talk to reference accounts who use the key management product and who have interacted with the vendor’s customer support team.

Topics: Encryption, Key Management, cloud