Townsend Security Data Privacy Blog

New York, Financial Institutions, and IBM i Encryption

Posted by Patrick Townsend on Oct 7, 2016 8:49:18 AM

New York State has finally had enough. Most people think that their bank has strong encryption in place to protect them from cyber criminals and nation-state actors. But that is too often just not the case. After a number of high profile data breaches at larger financial institutions, and a worrying increase in attacks on banks, the State of New York is about to require that banks implement some stringent security controls to protect Non-Public Information (NPI). The new regulations are not a difficult read, and you can find the proposed text here.

Let’s take a look at some of the new requirements.

First, banks are required to develop a cybersecurity program. This means that there must be formal security processes in place, and appropriate management oversight. Banks are required to have systems and processes to identify threats, respond to them, and recover from an attack. The regulation also requires appropriate reporting of security events, something that is not always honored in today’s environment.

Next, banks are required to have a Cybersecurity Policy that addresses a number of areas related to information security, incident response, access controls, data privacy and other areas. If you are familiar with the Center for Internet Security Critical Security Controls or the NIST Cybersecurity Framework, these points will be familiar to you. As you might expect, a premium is placed on ongoing adaptive responses to new threats, so these frameworks are not a check-box response to the problem. Banks are being asked to step up to a new level of seriousness around IT security.

You also need a Chief Information Security Officer (CISO) position designated in your organization. And, interestingly, the CISO is required to report to the board of directors of the bank. This means that boards of directors cannot remain ignorant about the state of the bank’s IT security, and that security will have to become a part of the board’s governance process.

The new regulations now get down into the weeds about what you should be doing. Here are some of the areas:

  • Vulnerability and Penetration testing
  • Collect and archive system logs
  • Implement restricted access controls
  • Conduct risk assessments at least annually
  • Hire cybersecurity professionals and make sure they stay current on new threats
  • Require third parties to adhere to the same security rules as the bank
  • Use multi-factor authentication for internal and external access to NPI
  • Train and monitor all personnel with access to NPI (SIEM, anyone?)
  • Encrypt NPI
  • Develop an incident response plan

The requirement to deploy encryption is surprisingly strong. It is a mandate unless deploying encryption is infeasible. Given the wide availability of encryption in all major operating systems and databases, I believe it will be difficult for a CISO to argue against the use of encryption. Clearly the regulators want banks to encrypt sensitive NPI data.

Who does the new rule affect?

The new rule affects any financial institution that is regulated by the New York Department of Financial Services. This will affect all national and global banking organizations as New York is one of the leading centers of global finance. It may also affect regional banks through the extension of the rules to affiliates. Regional and local banks should review their relationships with larger banks to try to understand their requirements under these new laws.

When do the new rules take effect?

The new rules take effect on January 1, 2017. In terms of time, that is very soon! And there is only a 180 day transition period. CISOs can request an extension, but these controls must be in place by January of 2018. That is a very aggressive timeframe!

How is the IBM i (AS/400, iSeries) affected?

Almost all large banks and financial institutions use the IBM i server somewhere in their organizations. The IBM i server is perfect for the back office applications that banks run and which need a very high level of availability. Many of the largest third party banking applications are deployed on IBM i servers.

But banks are going to face a big challenge with IBM i DB2 database encryption. The Non-Public Information that must be protected is often used for indexes in the DB2 database. While IBM DB2 supports column level encryption, it won’t work well with RPG programs that use encrypted indexes. Here at Townsend Security we have a fix for this problem! By implementing an OAR-based SQL interface for RPG files we are removing the impediment that banks face with encryption on the IBM i server. You can read more here.

And you can get a quick look at how this helps in this short video: 

Patrick

Topics: Compliance, IBM i

System Logging for the IBM i - A Must Have Security Control

Posted by Patrick Townsend on Oct 5, 2016 7:49:48 AM

IBM i (AS/400, iSeries) customers now know that their systems are no more secure than any other. That is, they are just as much a target as any Windows or Linux server that exists in a less secure network environment. This is not a criticism of the IBM i security implementation; it just reflects the fact that our IBM i servers are connected to a wide range of other devices including user PCs, network devices, local and WAN networks, and so forth. The attack surface is broad, IBM i servers are a rich target for data thieves, and we are all just one user PC infection away from an IBM i server data breach.

A lot of IBM i customers postpone deploying a system logging and SIEM solution to actively monitor for security threats, even though this is one of the most effective means of detecting and preventing a data breach. There is a reason that the Center for Internet Security recommends system log collection and monitoring with a SIEM solution as the 6th Critical Security Control. It just works really well when done properly.

I suspect that cost is one of the reasons that projects are postponed. This reasoning is “penny wise and pound foolish”. Here’s why:

  • There are now very cost effective log collection and SIEM monitoring solutions available for the smaller Enterprise. Some of these solutions are cloud-based and some can be deployed in VMware infrastructure. Cost is not as big an issue as it has been.
  • The failure to collect system logs means that it will be difficult to do the forensic investigation that you must do after a data breach. Imagine a costly data breach and then imagine not being able to figure out how the data thieves entered your system! All too often this means that a second and third data breach happens after the first. Without collecting system logs you won’t be able to deploy the forensics tools to trace the path of the cyber criminal through your organization. Another data breach is almost a certainty.
  • Early detection is one of the most effective means of preventing the data breach. This means large savings on outside security consultants, litigation costs, and the deployment of log collection and monitoring tools at inconvenient times. Early detection is a life-saver for those who can prevent a breach.
  • Log collection and SIEM monitoring solutions are easier to deploy than ever. While deploying an active monitoring system was complex and costly in the past, many SIEM solutions can now be deployed and start protecting your organization in just a matter of hours, and they are far easier to set up and administer than in years past, when installation and configuration took weeks.

From a Governance, Risk Management and Compliance point of view deploying the CIS Critical Security Controls is a minimum requirement. Here is what the California Department of Justice said in this year’s data breach report under “Recommendations”:

“The 20 controls in the Center for Internet Security’s Critical Security Controls identify a minimum level of information security that all organizations that collect or maintain personal information should meet. The failure to implement all the Controls that apply to an organization’s environment constitutes a lack of reasonable security.”

Every senior executive in the modern organization understands that canceling the fire insurance policy on their headquarters building would be a bad idea. You would be one fire away from the end of your career. The same now holds true for IT security. Shareholders and the board of directors now understand that the failure to use reasonable measures to protect data assets is a fundamental failure of governance. Don’t be the executive who has to try to defend that failure!

Patrick


Topics: System Logging, IBM i

The IBM i, OpenSSL and Security Patching

Posted by Patrick Townsend on Oct 4, 2016 1:31:14 PM

OpenSSL is one of the most widely used cryptographic applications and libraries. A new group of security vulnerabilities has just been identified in OpenSSL, and they need attention right away. IBM i (AS/400, iSeries) customers often feel isolated and protected from OpenSSL security issues, but this attitude can lead to trouble. IBM i customers need to monitor for vulnerabilities in OpenSSL and patch their IBM and third-party applications.

IBM uses the OpenSSL application in the Hardware Management Console, or HMC. The HMC is a highly privileged application that can manage many instances of IBM i logical partitions in a customer environment. You should carefully monitor security patches from IBM and apply them to your HMC. You can find IBM’s security alerts and notifications here.

On the IBM i platform itself you should be aware that the PASE environment includes the OpenSSL application. The PASE environment is an AIX emulation environment that is usually present on IBM i customer systems. Since PASE is an IBM licensed program you should apply Program Temporary Fixes (PTFs) to resolve issues with OpenSSL in PASE.

The IBM i no-charge OpenSSH licensed program also contains the OpenSSL application. However, OpenSSH does not use OpenSSL for session security; its primary use of OpenSSL is for cryptographic functions. While this reduces the security threat, it does not eliminate it. You should update OpenSSH when there are PTF patches available.

The OpenSSL application is also found in the Linux-like environment of the QSHELL application in later versions of the IBM i operating system. If you use the Start QShell (STRQSH) command you can use the “type openssl” command to determine where it is located. You can view the version of OpenSSL with this command: “openssl version”. As in the above examples, be sure you apply PTFs when there are security notices for OpenSSL.
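If you want to compare the reported version against a known-patched level programmatically, a small helper along these lines can do it. This is an illustrative sketch only; the “1.0.2j” cutoff below is a made-up example, so always check IBM’s security notices for the actual patched levels.

```python
# Illustrative sketch: parse the banner printed by "openssl version" and
# compare it to a minimum patched release. The cutoff is an example,
# not an actual advisory level.

def parse_openssl_version(banner):
    """Turn 'OpenSSL 1.0.2j  26 Sep 2016' into a comparable tuple (1, 0, 2, 'j')."""
    ver = banner.split()[1]                                  # e.g. '1.0.2j'
    letter = ver[-1] if ver[-1].isalpha() else ""
    major, minor, patch = (int(n) for n in
                           ver.rstrip("abcdefghijklmnopqrstuvwxyz").split("."))
    return (major, minor, patch, letter)

def needs_patch(banner, minimum="OpenSSL 1.0.2j"):
    """True when the reported version sorts below the minimum patched level."""
    return parse_openssl_version(banner) < parse_openssl_version(minimum)

print(needs_patch("OpenSSL 1.0.1t  3 May 2016"))   # True  (older than 1.0.2j)
print(needs_patch("OpenSSL 1.0.2j  26 Sep 2016"))  # False (already at that level)
```
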

Lastly, there are a number of third-party applications that embed the OpenSSL application. If these third-party applications are not using the IBM-provided OpenSSL libraries, you will need to contact those third-party software providers to receive an update. Be sure you carefully review any third-party application, open source application, or free tool for the presence of OpenSSL. Any application that performs FTP data transfers, implements an inbound or outbound web service, or communicates with an encryption key management server should be reviewed (see below for a statement about IBM i solutions from Townsend Security).

Here at Townsend Security we have several IBM i security applications that perform secure encrypted TLS connections. None of these applications use the OpenSSL application. Instead we use the native IBM i GSKit library for TLS session security which is configured using the IBM no-charge licensed Digital Certificate Manager (DCM) product. Our Alliance FTP Manager, Alliance AES/400 FieldProc encryption, Alliance Token Manager, Alliance Two Factor Authentication, and Alliance XML/400 solutions are not subject to the OpenSSL security vulnerabilities. Note that our Alliance FTP Manager solution uses the IBM OpenSSH application, so as mentioned above be sure to apply those OpenSSH security updates from IBM if you are using OpenSSH for transfers.

In addition to signing up for IBM i security alerts at the above web site, you can subscribe to US-CERT notifications here.

Hat tip to Alex Woodie for his article on IBM i OpenSSL vulnerabilities. You can find it here.

Patrick


Topics: IBM i

Creating the IBM Security Audit Journal QAUDJRN

Posted by Patrick Townsend on Sep 28, 2016 8:49:33 AM

Excerpt from the eBook "IBM i Security: Event Logging & Active Monitoring - A Step By Step Guide."



The QAUDJRN security audit journal does not exist when you first install a new IBM i server. You must create the journal receivers and the journal to start the process of security event collection. As is the case with any journal on the IBM i server you must first create a journal receiver and then create the QAUDJRN journal and associate it with the journal receiver.

The first step is to create a journal receiver, which will be the actual container of security events. You can create the journal receiver in a user library or in the IBM general-purpose library QGPL. Creating a library to hold the journal receivers is recommended, as this allows more flexible system management. You should use a sequence number in the last four or five characters of the journal receiver name; this allows the IBM i operating system to automatically create new receivers. You can use this command to create the journal receiver:

   CRTJRNRCV JRNRCV(MYLIB/AUDRCV0001)
             THRESHOLD(1000000) AUT(*EXCLUDE)
             TEXT('Auditing Journal Receiver')

Now that we have created the first journal receiver, we can create the QAUDJRN journal. The QAUDJRN journal is always created in the operating system library QSYS. You can use this command to create the journal and associate it with the first journal receiver:

   CRTJRN JRN(QSYS/QAUDJRN)
          JRNRCV(MYLIB/AUDRCV0001)
          MNGRCV(*SYSTEM) DLTRCV(*NO)
          RCVSIZOPT(*RMVINTENT *MAXOPT3)
          AUT(*EXCLUDE) TEXT('Auditing Journal')

Topics: System Logging, IBM i

IBM i Security: Event Logging & Active Monitoring

Posted by Patrick Townsend on Sep 20, 2016 9:40:55 AM

Excerpt from the eBook "IBM i Security: Event Logging & Active Monitoring - A Step By Step Guide."


Active monitoring and log collection are at the top of the list of effective security controls. IBM i (AS/400, iSeries) users have to solve some special challenges to implement this critical security control. The IBM i server is a rich source of security information. Unlike many Windows and Linux server deployments, the IBM i platform can host a complex mix of back-office applications, web applications, and open source applications and services. For convenience we will address these various log sources in terms of primary sources and secondary sources. Primary sources should be given priority in your implementation processes, as the most critical events are likely to be found in primary event sources.

Primary Sources

IBM Security Audit Journal QAUDJRN
When configured properly the IBM i server records a wide variety of security events in the security journal QAUDJRN. Events such as password failures, network authentication failures, failed SSL and TLS network negotiations, application and database authority failures, and many others go straight into the security audit journal QAUDJRN. This would naturally be the first focus of your event monitoring strategy. We discuss the setup and configuration of the QAUDJRN journal below.

System History File QHST
The system history message file QHST (actually a collection of files that start with the name QHST) is also a repository of important security information. All job start, job termination, and abnormal job termination events are recorded in the QHST files. Additionally, important operational messages are written to QHST files and should be collected and monitored.

Exit Points
The IBM i operating system provides a number of “hooks” for network and other services on the IBM i platform. These hooks are referred to as exit points. IBM i customers or vendors can write applications that collect information and register them for the exit points. Once registered, an exit point monitoring application can collect information on a wide variety of activities. These include information on inbound and outbound FTP sessions, TCP sockets activity, network authentication, administrative commands through Operations Navigator, and many other core operating system services. A good active monitoring strategy should collect information from exit points and pass the information to the centralized log collection server.

Secondary Sources

Web Applications
Most IBM i customers do not deploy the server as a primary web server, but selected web applications and services are often incorporated into a customer’s primary web deployment. The IBM i supports a rich set of web server platforms including IBM WebSphere, PHP, Perl, Python, and traditional HTML/CGI services. When web applications are deployed on an IBM i server they are a target for traditional attacks. Web logs should be included in your event collection and monitoring strategy.

Open Source Applications
For some years IBM has been bringing a wide variety of open source applications to the IBM i platform. This includes applications like OpenSSH with Secure Copy (SCP), Secure FTP (SFTP), and remote command shell. Many other application frameworks have been added, like Perl, PHP, and Python. All of these applications provide entry points and potential points of compromise for attackers. If you deploy these types of solutions you should enable and collect their logs.

System and Application Messages in QSYSMSG and QSYSOPR
Many customer and vendor applications write important operational and security messages to the system operator message queue QSYSOPR and to the optional system message queue QSYSMSG.


Topics: System Logging, IBM i

Encrypted Indexes - Overcoming Limitations with FieldProc Encryption on the IBM i

Posted by Luke Probasco on Aug 30, 2016 11:00:00 AM

Excerpt from the eBook "IBM i Encryption with FieldProc - Protecting Data at Rest."


Encrypted Indexes

The DB2 FieldProc implementation fully supports the encryption of columns which are indexes (keys) to the data in a native SQL context, including SQL RPG applications. However, there are some severe limitations with legacy RPG applications around encrypted index order. If you are like most IBM i users, it is important to understand these limitations before implementing DB2 encryption in RPG.


Legacy RPG applications use record-oriented operations, not the set-oriented operations that are typical of SQL. Many record-oriented operations on encrypted indexes in RPG will work as expected. For example, a CHAIN operation on an encrypted index to retrieve a record from a DB2 table will work. If the record exists it will be retrieved and FieldProc will decrypt the value. However, many range and data ordering operations will not work as expected with legacy RPG programs, leaving you with empty reports and subfiles. Consider the following logic:

   SETLL (position to a particular location in the index for a file)
   WHILE (some condition)
       READ (read the next record by the index)
   END WHILE

The SETLL (Set lower limit) operation will probably work if the particular index value exists. However, the program logic will then read the next records based on that position in the file. IBM i developers are surprised to learn that they will be reading the next record based on the ENCRYPTED value, and not on the decrypted value which is what they might expect. The result is often empty subfiles and printed reports. This is very common logic in applications where indexes are encrypted. Note that your RPG program will not get a file I/O error, it just won’t produce the results you expect.
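The surprise is easy to reproduce outside of RPG. Here is a small Python sketch — a toy model, not real FieldProc: the “cipher” just reverses the string, but like real deterministic encryption it preserves equality while destroying sort order — showing why a CHAIN-style equality lookup works while a SETLL/READ range read follows the wrong order:

```python
# Toy model of the encrypted-index problem (not real FieldProc or real crypto):
# a stand-in "cipher" that preserves equality but destroys sort order.

def toy_encrypt(value: str) -> str:
    return value[::-1]   # placeholder for AES; order-destroying like ciphertext

accounts = ["1002", "1003", "2001", "3009"]

# Equality lookup (CHAIN) still works: same plaintext -> same ciphertext.
index = {toy_encrypt(a): a for a in accounts}
assert index[toy_encrypt("2001")] == "2001"

# Range read (SETLL + READ loop) follows the encrypted key order instead:
by_plaintext  = sorted(accounts)
by_ciphertext = sorted(accounts, key=toy_encrypt)
print(by_plaintext)    # ['1002', '1003', '2001', '3009']
print(by_ciphertext)   # ['2001', '1002', '1003', '3009'] -- not what RPG expects
```
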

In simpler applications this side-effect of encrypted indexes is not significant, or easily programmed around. However, in some applications where sensitive data is encrypted across a large number of tables (think social security number in banking applications) this can be a significant limitation.

Don’t despair, keep reading.

Overcoming Encrypted Index Limitations in RPG

The limitations of encrypted indexes in legacy RPG applications often lead IBM i customers to abandon their encryption projects. The prospect of converting a large number of legacy RPG applications to newer SQL interfaces can be daunting. Their legacy RPG applications contain a lot of valuable business logic, and the effort to make the conversion can be quite large.

Wouldn’t it be great if you could wave a magic wand and make legacy RPG applications use SQL without any changes?

IBM opened a path to this type of solution with Open Access for RPG, or OAR. OAR allows for the substitution of traditional file I/O operations with user-written “Handlers”. In essence, you can replace the native file I/O operations of RPG with your own code, with no change to program file handling or business logic! The OAR capability spawned a number of user interface modernization products, and other solutions that take advantage of this. Imagine if your RPG screen handling I/O with Execute Format (EXFMT) could be replaced with a web-based GUI library. Instant modernization of the UI! There are now many solutions that leverage OAR for UI modernization.
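The handler idea can be sketched in a few lines. This is a conceptual model only — the class and method names are invented for illustration and are not IBM’s OAR API: record-level SETLL/READ requests are answered from a set-oriented view over decrypted values, so the RPG program sees the order it expects.

```python
# Conceptual sketch of an OAR-style handler (hypothetical names, not IBM's API):
# record-level operations are answered from a set-oriented query over
# plaintext values, so range logic sees decrypted order.
class SqlBackedHandler:
    def __init__(self, rows):
        # A real handler would run SQL against the table; here we simply
        # keep rows sorted by their decrypted key value.
        self.rows = sorted(rows)
        self.pos = 0

    def setll(self, key):
        """Position at the first record whose key is >= the given value."""
        self.pos = next((i for i, r in enumerate(self.rows) if r >= key),
                        len(self.rows))

    def read(self):
        """Read the next record in key order; None means end of file."""
        if self.pos >= len(self.rows):
            return None
        row = self.rows[self.pos]
        self.pos += 1
        return row

h = SqlBackedHandler(["1003", "2001", "1002", "3009"])
h.setll("1003")
print(h.read(), h.read())   # 1003 2001 -- plaintext order, as RPG expects
```
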

Join Logical Files with DDS

One limitation of logical files created with DDS specifications involves join logical files. You will not be able to create a DDS join logical file where the join involves a field encrypted with FieldProc. You will get an error about invalid data type usage. This is an IBM limitation and there is no known workaround for this issue. Note that this limitation only applies to DDS join logical files and does not apply to SQL joins using encrypted indexes. Most IBM i customers will need to change their RPG or COBOL program logic to avoid the use of DDS join logical files which use encrypted indexes.

Application Changes

Legacy RPG applications that use encrypted indexes often need re-design and re-programming to avoid the problems of encrypted indexes. You can avoid these issues if you are using an Open Access for RPG (OAR) solution that maps the legacy RPG record-based file operations to native SQL and the SQL Query Engine (See the note above about Townsend Security’s OAR/SQL offering).

If you need to re-design your RPG applications to avoid encrypted indexes, consider putting all of your sensitive data in a table that is indexed by some non-sensitive value such as a sequence number or customer number, and use FieldProc to encrypt that table. There are many approaches to application re-design and you should be able to find a method that works for you.

Change Management

IBM i customers who have deployed change management solutions can encounter challenges with FieldProc implementations. While most of these challenges are surmountable, it will take effort on the part of the systems management team to integrate FieldProc controls into their change management strategy. Unfortunately the implementation of FieldProc does not provide much in the way of support for change management systems. The activation of FieldProc changes an attribute on the column in the table, but change management systems generally are not aware of this attribute. It can be challenging to promote table and column level changes and properly retain the FieldProc attribute of the data. Look for a FieldProc implementation that provides a command-level interface to stop, start, and change FieldProc definitions. These commands can help you integrate FieldProc encryption into your change management strategy.


Topics: IBM i, FIELDPROC

IBM i, PCI DSS 3.2, and Multi-Factor Authentication

Posted by Luke Probasco on Aug 16, 2016 10:48:05 AM

With the recent update to the Payment Card Industry Data Security Standard (PCI DSS) regarding multi-factor authentication (also known as Two Factor Authentication or 2FA), IBM i administrators are finding themselves faced with the requirement of deploying an authentication solution within their cardholder data environments (CDE). Prior to version 3.2 of PCI DSS, remote users were required to use two factor authentication for access to all systems processing, transmitting, or storing credit card data. With version 3.2 this is now extended to include ALL local users performing administrative functions in the CDE.

Here is an excerpt from section 8.3: (emphasis added)

8.3 Secure all individual non-console administrative access and all remote access to the cardholder data environment (CDE) using multi-factor authentication.

I recently was able to sit down with Patrick Townsend, Founder & CEO of Townsend Security, and talk with him about PCI DSS 3.2, what it means for IBM i users, and what IBM i users can do to meet the latest version of PCI DSS.

Thanks for taking some time to sit down with me, Patrick. Can you recap the new PCI-DSS version 3.2 multi-factor authentication requirement? This new requirement seems to be generating a lot of concern.

Well, I think the biggest change in PCI DSS 3.2 is the requirement for multi-factor authentication for all administrators in the cardholder data environment (CDE).  Prior to 3.2, remote users like contractors and third party administrators, had to use multi-factor authentication to login to the network.  This update extends the requirement of multi-factor authentication for ALL local, non-console users.  We are seeing businesses deploy multi-factor authentication at the following levels:

  •      Network Level - When you first access the network
  •      System Level – When you access a server or any system within the CDE
  •      Application Level – Within your payment application

The requirement for expanded multi-factor authentication is a big change and is going to be disruptive for many merchants and processors to implement.

Yeah, sounds like this is going to be a big challenge.  What does this mean for your IBM i customers?

There are definitely some aspects of this PCI-DSS update that will be a bigger challenge on the IBM i compared to Windows or Linux.  First, we tend to run more applications on the IBM i.  In a Windows or Linux environment you might have one application per server.  On the IBM i platform, it is not uncommon to run dozens of applications.  What this means is, you have more users who have administrative privileges to authenticate – on average there can be 60 or more, and they can sometimes be a challenge to identify!  When merchants and processors look at their IBM i platforms, they will be surprised at the number of administrators they will discover.

Additionally, the IBM i typically has many network services exposed (FTP, ODBC, Operations Navigator, etc).  The challenge of identifying all the entry points is greater for an IBM i customer.

You say it is harder to identify an administrative user, why is that?

On the IBM i platform, there are some really easy and some really difficult administrative users to identify.  For example, it is really easy to find users with QSECOFR (similar to a Windows Administrator or Linux Root User) privileges.  But it starts to get a little more difficult when you need to identify users, for example, who have all object (*ALLOBJ) authority.  These users have almost the equivalent authority of QSECOFR.  Finding, identifying, and properly inventorying users as administrators can be a challenge.

Additionally, with a user profile, there is the notion of a group profile.  It is possible for a standard user, who may not be an administrator, to have an administrative group profile.  To make it even more complex, there are supplemental groups that can also adopt elevated authority.  Just pause for a minute and think of the complex nature of user profiles and how people implement them on the IBM i platform.  And don’t forget, you may have users on your system who are not highly privileged directly through their user profile, but may be performing administrative tasks related to the CDE.  Identifying everyone with administrative access is a big challenge.
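The kind of inventory logic Patrick describes can be sketched as follows. The profile records and field names here are invented sample data, not an IBM i API; on a real system you would pull this information from the user profile catalog:

```python
# Illustrative inventory sketch (sample data, not a real IBM i interface):
# flag a user as administrative if the profile itself, its group profile, or
# any supplemental group carries *ALLOBJ or *SECADM special authority.

profiles = {
    "QSECOFR":  {"special": {"*ALLOBJ", "*SECADM"}, "group": None, "supgroups": []},
    "PAYADMIN": {"special": {"*ALLOBJ"}, "group": None, "supgroups": []},
    "GRPOPS":   {"special": {"*ALLOBJ"}, "group": None, "supgroups": []},
    "JSMITH":   {"special": set(), "group": "GRPOPS", "supgroups": []},
    "CLERK1":   {"special": set(), "group": None, "supgroups": ["GRPOPS"]},
    "CLERK2":   {"special": set(), "group": None, "supgroups": []},
}

ADMIN_AUTH = {"*ALLOBJ", "*SECADM"}

def is_admin(name):
    p = profiles[name]
    groups = [g for g in [p["group"], *p["supgroups"]] if g]
    pools = [p["special"], *(profiles[g]["special"] for g in groups)]
    return any(pool & ADMIN_AUTH for pool in pools)

admins = sorted(u for u in profiles if is_admin(u))
print(admins)   # ['CLERK1', 'GRPOPS', 'JSMITH', 'PAYADMIN', 'QSECOFR']
```

Note how JSMITH and CLERK1 show up only through group membership — exactly the users who are easy to miss in a manual inventory.
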

Townsend Security has a multi-factor authentication solution for the IBM i.  How are you helping customers deal with identifying administrators?

From the beginning, we realized this would be a problem and we have taken some additional steps, specifically related to PCI DSS 3.2 to help IBM i customers identify administrators.  We made it possible to build a list of all users who have administrative access and then require them to use multi-factor authentication when logging on.  We have done a lot to help the IBM i security administrator identify highly privileged users and enroll them into a two factor authentication solution, and then periodically monitor/update/audit the list.

What are some of the other multi-factor authentication challenges that IBM i customers face?

Some of them are pretty obvious.  If you don’t have a multi-factor authentication solution in place, there is the effort of evaluating and deploying something on the IBM i server.  You’ll find users who may already have a multi-factor authentication solution in place for their Windows or Linux environments, but haven’t extended it to their IBM i.  Even if they aren’t processing credit card numbers on the IBM i, if it is in the CDE, it still falls within the scope of PCI DSS.

Aside from deploying a solution, there is going to be administrative work involved.  For example, managing the new software, developing new procedures, and putting governance around multi-factor authentication.  Further, if you adopt a hardware-based solution with key FOBs, you have to have processes in place to distribute and replace them, as well as manage the back-end hardware.  It has been really great seeing organizations move to mobile-based authentication solutions based on SMS text messages where there isn’t any hardware or FOBs to manage.  Townsend Security’s Alliance Two Factor Authentication went that route.

Let’s get back to PCI DSS.  As they have done in the past, they announced the updated requirements, but businesses still have a period of time to get compliant.  When does PCI DSS 3.2 actually go into effect?

The PCI SSC always gives merchants and processors time to implement new requirements.  The actual deadline to meet compliance is February 1, 2018.  However, what we are finding is that most merchants are moving rapidly to adopt the requirements now.  When an organization has an upcoming audit or Self Assessment Questionnaire (SAQ) scheduled, they generally will want to meet the compliance requirements for PCI DSS 3.2.  It never is a good idea to wait until the last minute to try and deploy new technology in order to meet compliance.

You mentioned earlier that you have a multi-factor authentication solution.  Tell me a little bit about it.

Sure. Alliance Two Factor Authentication is a mature, cost-effective solution that delivers PINs to your mobile device (or voice phone), rather than through an expensive key FOB system. IBM i customers can significantly improve the security of their IBM i systems through implementation of proven two factor authentication.  Our solution is based on a non-hardware, non-disruptive approach.  Additionally, we audit every successful or failed authentication attempt and log it in the security audit journal (QAUDJRN).  One thing that IBM i customers might also be interested in, is in some cases, we can even extend their existing multi-factor authentication solution to the IBM i with Alliance Two Factor Authentication.  Then they can benefit from the auditing and services that we supply for the IBM i platform.  Our goal was to create a solution that was very cost-effective, rapid to deploy, meets compliance regulations, and doesn’t tip over the IT budget.
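For readers curious what an SMS-style PIN flow looks like in outline, here is a minimal sketch. This is an assumption for illustration only, not Townsend Security’s implementation: generate a short-lived random PIN, deliver it out of band, and verify it with a constant-time comparison before it expires.

```python
# Minimal sketch of an SMS-style one-time PIN flow (illustrative only):
# issue a short-lived random PIN, then verify it in constant time.
import secrets, time, hmac

PIN_TTL_SECONDS = 300   # illustrative 5-minute lifetime

def issue_pin():
    pin = f"{secrets.randbelow(10**6):06d}"      # 6-digit one-time PIN
    return pin, time.time() + PIN_TTL_SECONDS    # would be sent via SMS/voice

def verify_pin(entered, issued, expires_at, now=None):
    now = time.time() if now is None else now
    if now > expires_at:
        return False                             # an expired PIN is never valid
    return hmac.compare_digest(entered, issued)  # constant-time comparison

pin, expires = issue_pin()
wrong = "000000" if pin != "000000" else "111111"
print(verify_pin(pin, pin, expires))                   # True
print(verify_pin(wrong, pin, expires))                 # False
print(verify_pin(pin, pin, expires, now=expires + 1))  # False: too late
```
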

Download a podcast of my complete conversation here and learn more about what PCI DSS 3.2 means for IBM i administrators, how to identify administrative users, challenges IBM i customers are facing, and how Townsend Security is helping organizations meet PCI DSS 3.2.

Podcast: IBM i, PCI DSS, and Multi-Factor Authentication

 

Topics: 2FA, IBM i, PCI DSS

Encryption & Key Management for the IBM i

Posted by Luke Probasco on Jul 25, 2016 10:38:04 AM

Excerpt from the eBook "IBM i Encryption with FieldProc - Protecting Data at Rest."


Encryption in FieldProc

It goes without saying that your FieldProc application will need to use an encryption library to perform encryption and decryption operations. IBM provides an encryption software library as a native part of the IBM i operating system. It is available to any customer or vendor who needs to implement encryption and decryption in their FieldProc programs.

Unfortunately, the native IBM encryption library is very slow. This might not be noticeable when encrypting or decrypting a small amount of data, but batch operations can be negatively impacted. The advent of AES encryption on the Power8 processor has done little to mitigate the performance issue with encryption. IBM i customers and third party vendors of FieldProc solutions should use caution when implementing FieldProc using the native IBM i AES software libraries. They are undoubtedly accurate implementations of AES encryption, but suffer on the performance front.

Key Management

An encryption strategy is only as good as the key management strategy, and it is difficult to get key management right. For companies doing encryption, the most common cause of an audit failure is an improper implementation of key management. Here are a few core concepts that govern a good key management strategy:

  • Encryption keys are not stored on the same system as the sensitive data they protect.
  • Security administrators of the key management solution should have no access to the sensitive data, and database administrators should have no access to encryption key management (Separation of Duties).
  • On the IBM i system this means that security administrators such as QSECOFR and any user with All Object (*ALLOBJ) should not have access to data encryption keys or key encryption keys.
  • More than one security administrator should authenticate before accessing and managing keys (Dual Control).
  • All access to encryption keys should be logged and audited. This includes use of encryption keys as well as management of keys.
  • Encryption keys should be mirrored/backed up in real time to match the organization’s standards for system availability.
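
Dual Control, one of the concepts above, is often implemented with key components: the key is split into shares so that no single custodian ever holds the whole key. The sketch below shows the classic XOR split for a 256-bit key encryption key; the function names are illustrative, not part of any particular key management product.

```python
import secrets

KEY_LEN = 32  # 256-bit key

def split_key(key):
    """Split a key into two components; either one alone reveals nothing."""
    component_a = secrets.token_bytes(len(key))
    component_b = bytes(k ^ a for k, a in zip(key, component_a))
    return component_a, component_b

def rejoin_key(component_a, component_b):
    """Both custodians must present their component to reconstruct the key."""
    return bytes(a ^ b for a, b in zip(component_a, component_b))

kek = secrets.token_bytes(KEY_LEN)
share_1, share_2 = split_key(kek)   # hand each share to a different admin
```

Because `component_a` is uniformly random, each share on its own is statistically indistinguishable from noise, which is what makes the split safe.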

Encryption Key Caching

Encryption keys are often used frequently when batch operations are performed on sensitive data. It is not unusual that a batch program would need to perform millions or tens of millions of encryption and decryption operations. While the retrieval of an encryption key from the key server may be very efficient, performance may suffer when keys need to be retrieved many times. This can be addressed through encryption key caching in the local environment.

Encryption keys should be cached in separate program modules, such as a service program, and not in user programs where they are more subject to discovery and loss. Any module caching an encryption key should have debugging options disabled and visibility removed. Secure key caching is critical for system performance, and care should be taken to protect key storage.
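
The caching pattern itself is straightforward: fetch from the key server once, keep the key for a bounded lifetime, and refetch when the entry expires. Here is a minimal sketch of that logic; in a FieldProc deployment this would live in a protected service program rather than application code, and the class and parameter names are assumptions for the example.

```python
import time

class KeyCache:
    """Minimal in-memory key cache with a bounded lifetime per entry."""

    def __init__(self, fetch, ttl_seconds=60.0):
        self._fetch = fetch            # callable: key_id -> key bytes (the key server call)
        self._ttl = ttl_seconds
        self._entries = {}             # key_id -> (key, expires_at)

    def get(self, key_id):
        entry = self._entries.get(key_id)
        now = time.monotonic()
        if entry is None or now >= entry[1]:
            key = self._fetch(key_id)  # one round trip to the key server
            self._entries[key_id] = (key, now + self._ttl)
            return key
        return entry[0]
```

With a cache like this, a batch job performing millions of decryptions pays the key-retrieval cost once per TTL window instead of once per record.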

Encryption Key Rotation

Periodically changing the encryption keys (sometimes called “key rotation” or “key rollover”) is important to the overall security of your protected data. Both data encryption keys (DEKs) and key encryption keys (KEKs) should be changed at appropriate intervals. The appropriate interval for changing keys depends on a number of variables, including the amount of data the key protects and the sensitivity of that data, as well as other factors. This interval is called the cryptoperiod of the key and is defined by NIST in Special Publication 800-57, “Recommendation for Key Management”. For most IBM i customers, rotation of data encryption keys should occur once a year and rotation of the key encryption keys should occur no less than once every two years.
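
The bookkeeping behind rotation amounts to tracking each key's age against its cryptoperiod and versioning key IDs so that older ciphertext can still name the key that protects it. This sketch encodes the intervals suggested above; the ID convention (`name.vN`) is an assumption for the example, not a standard.

```python
from datetime import date, timedelta

# Cryptoperiods from the text: DEKs yearly, KEKs at least every two years.
CRYPTOPERIOD = {"DEK": timedelta(days=365), "KEK": timedelta(days=730)}

def rotation_due(key_type, created, today):
    """True when a key has outlived its cryptoperiod and should be rotated."""
    return today - created >= CRYPTOPERIOD[key_type]

def next_version(key_id):
    """Versioned key IDs let old ciphertext keep naming its own key."""
    name, _, version = key_id.rpartition(".v")
    return f"{name}.v{int(version) + 1}"
```

After rolling a DEK, data encrypted under the old version is typically re-encrypted in the background, while the old KEK version is retained only long enough to unwrap keys it still protects.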

IBM i Encryption with FieldProc

Topics: Encryption, IBM i, FIELDPROC

Secure and Managed FTP on the IBM i (AS/400) Platform

Posted by Patrick Townsend on Jul 7, 2016 3:39:40 PM

The File Transfer Protocol (FTP) has been with us since the dawn of the Internet. Amazingly, it is still a critical component of electronic commerce, and all large organizations use FTP for integration with their customers and vendors. As a critical part of your electronic commerce infrastructure, you want to make sure that your FTP solution is reliable, secure, automated, and manageable. That’s where Managed FTP solutions come into play. Our Alliance FTP Manager falls into this category and helps IBM i (AS/400, iSeries) customers meet this critical need.

In this blog I want to look at just the security components of a Managed FTP solution. In a future blog we’ll look at the management components in more detail. But let’s start with security!

Secure Transfer Methods

Of course, we need to be sure that we are securing all of our FTP operations with strong encryption. Older FTP protocols did not encrypt FTP sessions and left organizations exposed to data loss both inside and outside of the corporate network. All of that is changed now. There are two types of secure, encrypted FTP methods in wide use:

  • Secure Sockets Layer FTP (SSL FTP, or sometimes FTPS)

  • Secure Shell FTP (SFTP)

SSL FTP is an extension of the original FTP protocol and is an Internet standard. As the need for secure eCommerce increased in the early 2000s the SSL FTP transfer method gained traction and large organizations transitioned to this secure and encrypted transfer method. Unfortunately, SSL FTP was difficult to implement in typical corporate networks and required modifications to firewall configurations. The complexity of the SSL FTP method made it difficult and expensive to implement and manage.

Secure Shell FTP, or SFTP, is a part of the Unix and Linux Secure Shell set of applications. While originally a Unix application, Secure Shell is now available on a wide set of operating systems and platforms. SFTP provides a secure implementation of file transfer and is much more friendly to the corporate network and network administrators. For this reason most organizations have transitioned to SFTP for their secure and encrypted file transfer needs.

While other open and proprietary file transfer solutions exist, SSL FTP and SFTP remain the dominant methods of secure file transfer for eCommerce.
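
In practice, an automated SFTP transfer is often driven through the OpenSSH `sftp` client in batch mode, which aborts on the first error and returns a meaningful exit status to a scheduler. The sketch below only builds the batch file and command line rather than executing a transfer; the host, user, and paths are hypothetical.

```python
from pathlib import Path

def sftp_put_command(host, user, local, remote_dir, identity_file,
                     batch_path="/tmp/sftp_batch.txt"):
    """Build an OpenSSH sftp batch script and the command that runs it.

    -b runs the batch file non-interactively; -i selects the key used
    to authenticate; BatchMode=yes prevents any password prompt.
    """
    batch = f"put {local} {remote_dir}/{Path(local).name}\nbye\n"
    cmd = ["sftp", "-b", batch_path, "-i", identity_file,
           "-o", "BatchMode=yes", f"{user}@{host}"]
    return batch, cmd

# Usage: write `batch` to batch_path, then run `cmd` with subprocess.run()
# and check the exit status before reporting the transfer complete.
```

Key-based authentication is what makes this scriptable; a Managed FTP product wraps the same mechanics with retry, scheduling, and audit logic.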

Additional Security Requirements

In addition to secure and encrypted transfer of files, a good managed FTP solution provides additional security controls. Let’s take a look at the ones you should find in a managed FTP solution:

File encryption: Many people are surprised to learn that encrypting a file transfer session is not an adequate level of security. When a file arrives at its destination it should also be protected at rest. This means encrypting the file before it is transferred with SFTP or SSL FTP. But doesn’t this mean the data is doubly encrypted? Yes it does. But protecting the file after it is transferred is crucial to a security strategy. Most organizations use Pretty Good Privacy (PGP) to encrypt a file before transfer, and to decrypt files that are received. Your Managed FTP solution should natively integrate PGP encryption into file transfers.
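
The encrypt-before-transfer step usually comes down to a GnuPG invocation against the partner's public key. This sketch builds that command rather than running it, so nothing here depends on a keyring being present; the recipient address and paths are hypothetical.

```python
def pgp_encrypt_command(plain_path, recipient):
    """Build the GnuPG command that encrypts one file for one recipient.

    The recipient's public key must already be imported into the keyring.
    Returns the path of the .gpg file that should actually be transferred.
    """
    out_path = plain_path + ".gpg"
    cmd = ["gpg", "--batch", "--yes",        # non-interactive, overwrite output
           "--recipient", recipient,
           "--output", out_path,
           "--encrypt", plain_path]
    return out_path, cmd

# To execute:  subprocess.run(cmd, check=True)
# then hand out_path (not plain_path) to the SFTP layer.
```

The important point from the text survives in the flow: the `.gpg` file is what travels, so the data stays protected at rest on the remote side even after the encrypted session ends.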

Configuration access control: Configuring managed FTP transfers involves setting local and remote configuration parameters, encryption parameters, and many other aspects of file transfer operation. Your managed FTP solution should implement configuration access controls and notify you of an attempted violation.

Two Factor Authentication (2FA): Control over the administrative functions of a Managed FTP solution should include Two Factor Authentication. This is now a requirement for administrative access to payment card systems by the PCI Data Security Standard (PCI-DSS), but is also a security best practice for any critical system. Be sure your Managed FTP solution provides for 2FA or that you implement 2FA on the IBM i system level.

Compliance audit: Sending and receiving files that contain sensitive data requires that you retain a clear file transfer history. This is a minimal level of audit reporting and you will want to be sure your Managed FTP solution provides clear and easy to read audit trails.

System logging: Actively monitoring your system is a critical security control. On the IBM i server this means monitoring security events and transferring them in real time to a log collection server, or better yet, to a SIEM solution. FTP is often the mechanism by which cyber criminals steal information from your system, so a Managed FTP solution should be logging file transfers to the IBM security audit journal QAUDJRN. The security audit journal provides an unmodifiable repository of security events, and your file transfer information should be recorded there. Look for this feature in your Managed FTP solution.
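
Whatever the collector, the forwarded event needs enough structure for a SIEM to correlate it: who moved which file, where, and whether it succeeded. The sketch below formats one such event as JSON; the field names are illustrative, not a standard schema, and on the IBM i the same event would also land in QAUDJRN.

```python
import json
import time

def transfer_audit_event(user, local_file, remote_host, direction, success):
    """Format one file-transfer event for forwarding to a log collector."""
    event = {
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "event": "file_transfer",
        "user": user,
        "file": local_file,
        "remote_host": remote_host,
        "direction": direction,          # "send" or "receive"
        "result": "success" if success else "failure",
    }
    return json.dumps(event)
```

Emitting both successes and failures matters: a burst of failed transfers to an unfamiliar host is exactly the pattern a SIEM rule should catch.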

Software updates and patching: Secure FTP protocols are periodically subject to the need for security patching. A recent security flaw in the SFTP protocol required updates for all systems that implement this Secure Shell protocol. Fortunately, on the IBM i platform IBM provides the SSH implementation as a no-charge licensed product, and updates are available through normal system patching procedures. Be sure that your Managed FTP solution integrates with the IBM solution, or that the Managed FTP vendor has an adequate strategy to provide you with security updates.

Backup and Recovery: As the new EU General Data Protection Regulation (EU GDPR) correctly points out, backup and recovery is a part of your security strategy. If you can’t recover from a system failure in a reasonable period of time you risk losing data that is critical for your customers and employees. We hold that data in trust for them, and protecting it also means resiliency in the event of system failures. Be sure your Managed FTP solution fits into your backup and recovery strategy for the IBM i platform.

These are critical security components of a Managed FTP solution. Some organizations we work with transfer thousands of files every day. I believe we’ve addressed the core security requirements in our own Alliance FTP Manager solution and we continue to invest in R&D to make these features better going forward. I will address other aspects of Managed FTP in future blogs.

Patrick

Webinar: Secure Managed File Transfer on IBM i

Topics: Managed File Transfer, IBM i, Secure Managed File Transfer, FTP Manager for IBM i

AES Encryption Performance on the IBM i (AS/400, iSeries)

Posted by Michelle Larson on Feb 8, 2016 7:39:00 AM

Understanding the basics can help you avoid problems! 

As enterprise customers deploy data security solutions to meet various compliance regulations (PCI DSS, HIPAA, etc.), they are frequently surprised by the performance impacts of encryption. Inadequate preparation in an encryption project can lead to increased costs, delayed (or even failed) projects, inability to meet compliance requirements, and even exposure in the event of a data breach.

By their very nature, encryption and decryption are resource-intensive processes. Enterprise customers can be surprised to discover that encryption from one vendor can perform very differently than the very same encryption from another vendor. While the various vendor solutions accomplish the same tasks, they vary greatly in how efficiently they do these tasks. The differences can vary by a factor of 100 or greater! This can have a large impact on business applications that perform encryption and decryption tasks. One vendor’s solution may encrypt a data set in 10 minutes, and another vendor’s solution may take 10 hours to perform the same task!

Avoid surprises, ask for performance metrics:

Armed with the knowledge that encryption performance is important, you can take action to avoid potential problems. Before acquiring an encryption solution, ask your data security vendor to provide performance metrics for their solution. How long does it take to encrypt one million credit card numbers? Can they provide you with source code and demonstrate this performance on your server? Optimizing software for performance is a complex task and usually involves specialized technical talent and some experimentation with different computational techniques. Unless an encryption vendor is deeply committed and invested in encryption technologies, they may not make performance enhancements to their applications.

Create your own proof-of-concept applications that measure encryption and decryption performance in your application environment. Be sure to measure how well the encryption solution performs under your current transaction loads, as well as anticipated future transaction loads. A good rule of thumb is to be sure you can handle three times your current encryption volume. This will position you for increased loads due to unexpected changes in the market, or an acquisition of another company. It also ensures that you are seeing real-life performance metrics, and not just the vendor’s marketing message.
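
A proof-of-concept harness can be as small as the sketch below: time the same workload against each candidate's encryption call and compare records per second. The `encrypt` callable is a stand-in for whatever vendor API you are evaluating; taking the best of several runs reduces noise from other workloads on the system.

```python
import time

def throughput(encrypt, records, repeats=3):
    """Measure records-per-second for an encryption callable.

    Runs the full workload `repeats` times and keeps the fastest run,
    which is the fairest basis for comparing two vendors head to head.
    """
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        for rec in records:
            encrypt(rec)
        best = min(best, time.perf_counter() - start)
    return len(records) / best

# Usage (hypothetical): rate = throughput(vendor_encrypt, million_card_numbers)
```

Run it with a realistic record count, such as the one million credit card numbers suggested above, so the result reflects batch behavior rather than single-call latency.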

Avoid hidden costs, ask for pricing calculations:

Ask your purchasing and accounting departments to include performance upgrade costs in the pricing calculation during vendor evaluation. Be sure these costs include any increases in software license fees. If an encryption solution consumes one third of the CPU processing power of a server, you might want to include the cost of upgrading to a processor twice as powerful as the one you have. Working these costs in during the product evaluation phase can provide a more realistic view of the actual cost of a vendor encryption solution. Upgrading hardware can lead to unexpected additional software costs: some software vendors license their solutions based on the number or speed of the processors in your server, so upgrading hardware to solve a performance problem can result in increased software license fees.
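
The arithmetic behind that evaluation is simple enough to put in a spreadsheet or a few lines of code. The sketch below is a rough model, with the one-third CPU threshold taken from the example above; all figures and the function name are hypothetical.

```python
def encryption_tco(base_license, cpu_fraction, upgrade_cost, license_uplift):
    """Rough total cost when an encryption solution may force a hardware upgrade.

    cpu_fraction: share of server CPU the solution consumes under load.
    If it is heavy enough to force a processor upgrade, add the hardware
    cost plus any per-processor license increases the upgrade triggers.
    """
    total = base_license
    if cpu_fraction >= 1 / 3:
        total += upgrade_cost + license_uplift
    return total

# A slow solution ($10k license, 35% CPU) can cost more overall than a
# faster, pricier one ($25k license, 5% CPU) once upgrades are counted.
```

Comparing candidates with the full model, rather than license price alone, is the point of the exercise.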

Avoid red flags, not all AES encryption solutions are the same:

Some encryption solutions use “shadow files” (files external to your application) to store encrypted data. The use of shadow files normally indicates that the vendor has an incomplete implementation of the AES encryption suite, or that the system architecture is limited in some way. The use of shadow files can impose severe performance penalties: each encryption or decryption task requires an additional file read or write, which essentially doubles the file activity. This may also increase processor loads as your application mirrors the data to a hot backup system. You will want to be very careful in measuring the performance impacts of encryption solutions that use shadow files.

If an encryption vendor will not provide you with a fully functional evaluation of their solution, this represents a clear warning signal. Your application environment is unique and you will need to be able to evaluate the impact of encryption in your environment with a limited test. A vendor who refuses to provide you with a clear method of evaluating the performance of their solution may not have your best interests in mind.

Avoid frustrations, take a test drive with us:

Despite an organization’s best efforts, data will get out. The best way to secure sensitive information is with strong encryption that is NIST compliant and FIPS 140-2 compliant key management that meets or exceeds the standards in PCI, HIPAA/HITECH, and state privacy laws. For a more technical look at AES encryption, including FieldProc exit points and POWER8 on-board encryption, check out this blog by Patrick Townsend, Founder and CEO of Townsend Security: How Does IBM i FieldProc Encryption Affect Performance?

Our proven AES encryption solution encrypts data 115x faster than the competition. But don’t just take our word for it, we provide a fully functional evaluation!  Request a free 30-day trial (full version) of our popular Alliance AES Encryption and see for yourself.

AES Encryption 30-day free trial

Topics: Alliance AES/400, IBM i, AES Encryption, iSeries, AS/400, Evaluation