Townsend Security Data Privacy Blog

Would Your Data Security Strategy Pass an Audit?

Posted by Michelle Larson on Dec 20, 2013 9:27:00 AM

Are You Confident You Are Meeting Compliance Requirements?

Why do we have so many different compliance regulations affecting our companies and our need to protect data? The fact is that people are actively trying to access that sensitive information, and devastating data breaches happen on a regular basis. While breaches are painful for companies, which suffer lost customers, brand damage, and stiff financial penalties, it is consumers and individuals who are most impacted by the loss of personal information, credit card numbers, or bank account numbers. Because these breaches happen and have such a catastrophic effect on individual people, state, federal, and industry regulations have become necessary to motivate companies to protect that sensitive information and keep it out of the hands of those who would use it to commit financial crime and fraud.

Webinar: Would your Data Security Strategy Pass an Audit?

Since most companies fall under a number of compliance regulations, here is a recap of the most predominant points:

PCI Data Security Standard (PCI DSS) applies to merchants, public or private, who take credit cards for payment. While PCI DSS applies to payment cards, credit cards, and debit cards (anything to do with electronic payments), sections 3.5 and 3.6 contain core requirements for encryption and proper key management:

  • You must encrypt credit card numbers
  • You must use an industry standard encryption (AES)
  • You must provide proper management of encryption keys
  • You must have dual control, split knowledge, separation of duties

PCI section 10 requires logging:

  • Tracking user access to core resources
  • Collecting security events in an un-modifiable log
  • Consolidating the logs across all of your servers
  • Monitoring them for potential breaches

The HIPAA/HITECH Act covers the medical segment: any covered entity or business associate under the federal law must comply with data protection requirements for patients' protected health information (PHI). The most recent meaningful use guidance was very clear that organizations who fall under HIPAA/HITECH must protect patient health information and use proper key management as part of any encryption strategy. The guidance was quite blunt: ‘don't store encryption keys on the device with protected data’... there is no gray area there!

GLBA/FFIEC applies to the financial industry (banks, credit unions, trading organizations, credit reporting agencies). The Gramm-Leach-Bliley Act sets standards for protecting consumer financial information. The FFIEC is responsible for publishing guidance, performing audits, and enforcing the standards set by GLBA around encryption and key management best practices.

Sarbanes-Oxley (SOX) applies to publicly traded companies (section 404 covers information technology and data protection for stakeholders). SOX provides detail around data protection, guidance around cryptographic key management, and security requirements for data management. It issues strong guidance for encrypting sensitive personally identifiable information (PII) managed by a publicly traded company. SOX guidance closely mirrors that of the National Institute of Standards and Technology (NIST), which publishes best practices for encryption key management, key management lifecycles, and logging.

In the United States we have a number of state privacy laws; some of them mandate encryption, others strongly recommend it. These laws apply to both public and private organizations of all sizes and provide guidance for breach notification and penalties around data loss. There is wide recognition that protecting data using industry-standard encryption and proper key management provides a basic safe harbor from breach notification. Additionally, there is a proposed federal privacy law that may eventually replace the individual state laws.

What elements do all of these regulations have in common?

  • All expect organizations to secure personally identifiable information (anything that can actually be used to individually and specifically identify somebody) with encryption or tokenization and to actively monitor their systems
  • Laptops, mobile devices, removable storage, tape archives, or backup archival files must be encrypted
  • Vendors, business associates, and service providers must meet the same regulations as the industry they are serving
  • Multiple regulations may apply to one company (e.g., a doctor's office that takes credit card payments would fall under both PCI DSS and HIPAA/HITECH)

One of the biggest points of audit and compliance failure is the encryption key management strategy. While compliance regulations do not mandate FIPS 140-2 validation of a key management solution, auditors will red flag encryption or key management that is not industry-standard. They are looking for certifications like NIST validation of AES libraries or other encryption components and FIPS 140-2 validation of key management solutions. Once you encrypt your data with AES, the encryption keys become the secret you must protect. The nature of an encryption key is that it is unique to you and cannot be easily detected or reverse engineered from the data itself. Look to NIST for recommendations about how to manage the creation and lifecycle of an encryption key (when it should be rotated or changed).
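To make that separation concrete, here is a minimal Python sketch using the open-source cryptography package's AES-GCM support. The fetch_key_from_key_manager function is a hypothetical stand-in for a call out to an external key manager; the point is that the AES key lives only in memory and is never stored next to the data it protects.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def fetch_key_from_key_manager(key_name: str) -> bytes:
        # Hypothetical placeholder: in practice this would be an authenticated,
        # TLS-protected call to an external key manager. The key is never
        # written to disk alongside the encrypted data.
        raise NotImplementedError("retrieve the 256-bit AES key from your key manager")

    def encrypt_card_number(card_number: str, key_name: str = "CARD-KEY-001") -> bytes:
        key = fetch_key_from_key_manager(key_name)   # key held only in memory
        aesgcm = AESGCM(key)                         # NIST-standard AES in GCM mode
        nonce = os.urandom(12)                       # unique nonce for every encryption
        ciphertext = aesgcm.encrypt(nonce, card_number.encode(), None)
        return nonce + ciphertext                    # store nonce + ciphertext, never the key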

What do auditors look for in certifications and standards?

  • Standards-based encryption (AES)
  • FIPS 140-2 validated key management
  • Security best practices of dual control, separation of duties, and split knowledge
  • Policy based security

When developing a data protection strategy, apply the best and strongest data protections, provably based on industry standards and security best practices, to all sensitive data, and remember:

  • Regulations are getting more stringent and specific… not less!
  • Fines and penalties are getting steeper… not cheaper!
  • Define personally identifiable information (PII) broadly…not narrowly!

Also crucial, when your data moves across multiple operating systems, databases, and platforms, is to look for a common approach to encryption and key management; it will be very helpful in reducing costs and maintenance over the long term.

I’ve included a link to our recently recorded webinar, which focuses on the IBM i system but is applicable across most platforms. It offers a great deal of detail and information on how we can help you address compliance regulations and the four core components of a data protection strategy (on the IBM i, Windows, Oracle, or a number of other platforms) for which Townsend Security provides solutions:

  • Encryption
    • Data at rest – AES Encryption
    • Whole file encryption with PGP
  • Tokenization
  • Encryption Key Management
  • Secure System Logging

Webinar: Would your IBM i Pass an Audit?  

Please request the webinar download!

Topics: Compliance, Data Security, IBM i, Encryption Key Management, Webinar

3 Critical Best Practices for Encryption Key Management on the IBM i

Posted by Liz Townsend on Oct 7, 2013 1:35:00 PM

Patrick Botz, founder of Botz and Associates and former Lead Security Architect at IBM, recently published a White Paper in conjunction with Townsend Security discussing dual control, split knowledge, and separation of duties--three critical controls needed to protect encryption keys and encrypted data on the IBM i platform. These controls are considered “best practices” in the IT industry, and it is common knowledge amongst security professionals that without these controls in place, any organization could be at risk for a major data breach.

Key Management for IBM i - Audit Failures

Just like financial controls that are put in place to prevent fraud in a business, these concepts are used in IT security to prevent data loss. As data breaches are reported in the news almost every day, we can easily see the consequences of data loss: public scrutiny, hefty fines, lost business, and litigation are just a few of the ramifications. Implementing these controls reduces the potential for fraud or malfeasance caused by the mishandling of data or a data loss event due to hackers, employee mistakes, or stolen or lost hardware.

In this white paper Patrick Botz outlines the importance of these three controls and explains why they must be used to protect data stored in IBM i databases. Botz discusses on-board master key capabilities provided by the IBM Cryptographic Services APIs on an IBM i, the limitations of the IBM i Master Key Facility, and why organizations should use third-party key management to protect their sensitive data.

The top 3 critical best practices are:

Separation of Duties - This is a widely known control put in place to prevent fraud and other mishandling of information. Separation of duties means that different people control different procedures so that no one person controls multiple procedures. When it comes to encryption key management, the person who manages encryption keys should not be the same person who has access to the encrypted data.

Dual Control - Dual control means that at least two people control a single process. In encryption key management, this means at least two people should be needed to authenticate access to an encryption key, so that no single person has access to an encryption key.

Split Knowledge - Split knowledge prevents any one person from knowing the complete value of an encryption key or passcode. Two or more people should know parts of the value, and all must be present to create or re-create the encryption key or passcode. While split knowledge is not needed to create data encryption keys on the IBM i, it is needed for the generation of master keys which are needed to protect data encryption keys. Any encryption keys that are accessed or handled in the clear in any way should be protected using split knowledge.
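As an illustration of the split knowledge concept (not how the IBM i Master Key Facility or any particular product implements it), a key can be derived from two random components held by separate custodians, so that neither person alone knows the key. A minimal Python sketch:

    import os

    def make_key_components(key_bytes: int = 32):
        """Each custodian generates and privately retains one random component."""
        component_a = os.urandom(key_bytes)
        component_b = os.urandom(key_bytes)
        return component_a, component_b

    def combine_components(component_a: bytes, component_b: bytes) -> bytes:
        """Both components are required; XOR of the two yields the working key.
        Neither custodian can recover the key from their half alone."""
        return bytes(a ^ b for a, b in zip(component_a, component_b))

    # Usage: both custodians must be present to create or re-create the key.
    part_a, part_b = make_key_components()
    master_key = combine_components(part_a, part_b)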

These three core controls should always be used when storing or transferring encrypted sensitive data. A certified, hardened hardware security module (HSM) designed to secure data encryption keys and key encryption (master) keys should implement these controls in the administration of the key manager. NIST FIPS 140-2 validation is an important certification to look for in an encryption key manager. This certification ensures that your key manager has been tested against government standards and will stand up to scrutiny in the event of a breach.

Automatic Encryption on V7R1
With the release of IBM i V7R1, users can now encrypt data automatically with no application changes. This is great news for IBM i users, since encryption has been a difficult task in the past, requiring specialized encryption solutions on earlier versions of IBM i. Protecting your encryption keys in an external key management HSM is the critical next step to protecting your encrypted data.

To learn more about encryption key management for the IBM i download the full White Paper “Encryption Key Management for IBM i - Sources of Audit Failures,” by IBM i security experts Patrick Botz and Patrick Townsend.

Key Management for IBM i - Sources of Audit Failures

Topics: Separation of Duties, Patrick Botz, Split Knowledge, IBM i, Encryption Key Management, White Paper, Dual Control

Signs Your IBM i May Have Been Hacked - part 2

Posted by Michelle Larson on Oct 3, 2013 9:20:00 AM

As we discovered in the blog Signs Your IBM i May Have Been Hacked, the combination of secure system logging on the IBM i and log monitoring with a SIEM will help you secure sensitive data and minimize the impact of security breaches. Hopefully you were able to watch the webinar resource provided (if not, you can request it HERE). After the webinar, we had a number of questions asked by attendees and answered by industry experts from Townsend Security and Integrity. Here is a recap of that Q&A session:

Q: Do compliance regulations require system logging?

A: Most regulatory compliance standards, such as PCI-DSS, FISMA, GLBA, and HIPAA/HITECH, require organizations to monitor their network in real time and provide audit reports. For the Payment Card Industry Data Security Standard (PCI-DSS), there are numerous logging requirements to be PCI compliant. Auditors want to look at how the logs are generated: whether the process is systematic, and whether an operator can access or edit them, pull them off the system, or move them somewhere else. They want to see whether events are mirrored off the system through an automated process without any potential human intervention. They also check whether people have the right privileges. Logs will show user events as well as which individuals are accessing libraries, files, or other areas outside of their designations. Logging is not only an industry best practice, it is a critical control for understanding access to a system.

Q: We have some custom applications that run our core business. Can a SIEM solution analyze the log files that come from these applications?

A: Dave Nelson from Integrity answers, “Some SIEM applications are able to analyze log files from custom applications; others are not. Integrity’s SIEM can create a custom parser that can take just about any log you can provide. We’ll work with your internal application development staff to identify the error codes, security event log codes, or whatever it is you’re creating to identify a specific event. We can map that into the parser, then map those to either standard alerts or new custom alerts, and we can customize thresholds and a lot of other things. That’s one of the reasons our customers choose us most frequently: they have custom internal applications that a lot of the other SIEM tools can’t handle, but we can, and we can give them a lot of information about something that’s very unique to their business.”
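Integrity's parser is their own product, so the following is only a conceptual Python sketch of what mapping custom application event codes to standard alerts can look like; the event codes, log format, and severities here are invented for illustration.

    import re

    # Hypothetical mapping of custom application event codes to alert levels;
    # a real SIEM parser would be built together with the application developers.
    ALERT_MAP = {
        "SEC-4001": ("failed_admin_login", "high"),
        "SEC-2003": ("record_exported", "medium"),
        "APP-0007": ("batch_job_completed", "info"),
    }

    LOG_PATTERN = re.compile(r"^(?P<ts>\S+ \S+) (?P<code>[A-Z]+-\d{4}) (?P<detail>.*)$")

    def parse_line(line: str):
        """Turn one custom-format log line into a normalized alert record."""
        match = LOG_PATTERN.match(line.strip())
        if not match:
            return None                      # unrecognized lines can be queued for review
        event, severity = ALERT_MAP.get(match["code"], ("unknown_event", "low"))
        return {"timestamp": match["ts"], "event": event,
                "severity": severity, "detail": match["detail"]}

    print(parse_line("2013-10-03 09:20:11 SEC-4001 user QSECOFR from 10.1.2.3"))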

Q: You mentioned File Integrity Monitoring (FIM), can you further explain how an organization would use it?

A: It’s not every field that you’re going to want to alert, log, and monitor on, but there might be fields with credit card numbers or store order authorization codes that you want to monitor to make sure the data hasn’t been altered or accessed without consent. The point to stress with logging and file integrity monitoring is that it ultimately helps the individual system operator. You can have mirrored alerts go to multiple people in the company, security officers as well as system operators. With FIM you take the responsibility off any one person having to follow up on everything, and you can create more of a collective team that analyzes this data to help the business.

Q: How can we distinguish a false alarm from a successful attack?

A: Sometimes it can be very difficult to distinguish a false alarm from a successful attack until you have done an entire investigation. People who do this day in and day out begin to identify the patterns and trends of what makes an attack successful or not. In our experience, the easiest way to do it is to look for key data points or key events that should have happened. One of the things you can do, if you know that a specific attack was successful, is jump right to the end and work your way back through the system to determine the file name and creation date. This really only comes with the experience and practice of identifying the missing pieces.

Please post any additional questions you may have here on the blog!

For a much deeper and more detailed discussion on secure system logging and monitoring as essential controls to detect and mitigate the risk of a data breach, please request a download of the entire webinar:

Learn the importance of system logging and monitoring

Topics: System Logging, File Integrity Monitoring (FIM), IBM i, Alliance LogAgent, Data Breach, Integrity

Signs Your IBM i May Have Been Hacked!

Posted by Michelle Larson on Sep 24, 2013 3:40:00 PM

(Based on a recent webinar with Townsend Security and Dave Nelson, President of Integrity)

Your IBM i may have been hacked and you don’t even know it yet!

Industry experts from Townsend Security and Integrity discuss how the combination of secure system logging on the IBM i and log monitoring with a SIEM will help you secure sensitive data and minimize the impact of security breaches. Topics cover (and go beyond) how log files and log data are the digital evidence (artifacts) that actually take us to a point of action within a system. They look at what the false alarms are within the plethora of data and how to screen those out. Then they talk about the next steps: what red flags to watch for, and what to do with them.

“As we look at the millions of data points that are created each day, every login or logout, every time a user is created, every time a user accesses a resource or adds a new resource or saves a file…. amidst all that data, hacking events happen. What we have to try and do is understand the ways that we can sift through that data and reduce the background noise and address the successful attacks.” (Dave Nelson)

Things to look for in log files as we’re trying to identify what’s real data, false alarms, or red flags:

New users and user accounts - Look for things like random names (like BSX or BS4XOR) and be able to identify new users. Always be able to trace new user accounts back to a user account request, and identify which of those accounts have an approved resource and which do not.

New files and directories - Identify new directories, and look for batches of files that show up between files that are normally next to each other. One of the things hackers love to do is hide files in any Windows-mountable or UNIX-mountable directories on your IBM i, because a lot of the time the IBM i doesn’t have an antivirus check or antivirus application on it.

Date and time stamps - There are some (system) files that you know shouldn’t change. If you start to notice that the modification dates or the save dates on those files and libraries have changed, that should be a red flag.

Significant increase or decrease in the size of a file or a library - Hackers will inject data into the back end of an existing file so that the file itself doesn’t appear to change and can still be executed. So watch for files that used to be a few kilobytes and are now a few megabytes or even gigabytes (a simple baseline check like the sketch after this list can catch this).

New processes or services that are running - Any time you have a batch job running that you’re not familiar with, that should be something you look at right away. Look for unusual interactive jobs working between LPARs or between systems. Do you normally have data leaving your IBM i and going to another platform, or a direct connection from a Windows server directly into your IBM i?

Cryptic or unusual file names - Create some sort of naming convention within your organization so that you know if something is outside of that standard.
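The size, timestamp, and new-file signs above are exactly what a baseline comparison catches. Purely as an illustration (not how Alliance LogAgent or IBM i journaling is implemented), a minimal file integrity check records a hash, size, and modification time for each watched object and flags anything that drifts:

    import hashlib
    import os

    def snapshot(paths):
        """Record a baseline of content hash, size, and modification time."""
        baseline = {}
        for path in paths:
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            baseline[path] = {"sha256": digest,
                              "size": os.path.getsize(path),
                              "mtime": os.path.getmtime(path)}
        return baseline

    def compare(baseline, paths):
        """Flag new files, changed content, and large size swings against the baseline."""
        alerts = []
        for path, info in snapshot(paths).items():
            old = baseline.get(path)
            if old is None:
                alerts.append(f"NEW FILE: {path}")
            elif info["sha256"] != old["sha256"]:
                alerts.append(f"CHANGED: {path} (size {old['size']} -> {info['size']}, "
                              f"mtime {old['mtime']} -> {info['mtime']})")
        return alerts

    # Usage: persist the baseline off-box and run the comparison on a schedule.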

It is suggested that we think of log files as the forensic evidence for the IBM i system, and of monitoring almost as a crime scene investigation. The relationship between the logging agent and the collector of those logs is very important, because unexplained system value configuration changes, application changes, and changes to privileges and privileged user profiles are indicators of potential malicious activity that you can record. These logging tools help an organization really get to know what the system is doing as part of daily business activity, and then how to alert and monitor for data protection.

With all the different types of data you can look for and the sheer volume of information out there, there’s absolutely no way that an individual system administrator, application developer, or even a full-time security professional is going to be able to sift through that amount of information. Partnerships between the SIEM (Security Information and Event Management) collector and the logging agent are now industry-standard defense-in-depth controls. Automation and email notifications about potential malicious activity can immediately give you the chain of custody to provide the digital evidence you need to investigate further. You want to be able to drill down to specific threats, events, and user-specific events as part of any good governance, risk & compliance program and risk management approach. Essential for a total enterprise solution is the partnership (and strong encryption) between LogAgent and a SIEM.

As a SIEM solution that partners with Townsend Security’s logging solution*, what Integrity has done differently is provide a managed SIEM service. Dave tells us, “We’ve got clients running this on the IBM i platform using Alliance LogAgent to monitor, interfacing with our SIEM, and they have said ‘Wow, we didn’t have any idea that we could get this much information, that it could be this easy to access, and that we could share it.’ Clients want to be able to share that with their network administrators and say, ‘See, this is what we’re seeing, we’re seeing this traffic and we don’t know why it’s coming in, can you please stop it and block it.’ One of the best things about Integrity’s SIEM solution from a cost perspective is that there’s no capital investment. You don’t have to spend $100,000 for the software and $50,000 for hardware and then go out and hire a full-time person to review these logs, set up the system, and manage another system and application within the environment. It’s all provided for you for a low monthly cost. You get this in a matter of days and weeks instead of a matter of months, so you’re getting an immediate return on your investment. In these economic times we all know how important it is to be able to show, ‘Hey, we’re getting some real value for this expenditure, we’re seeing a lot of things happening.’ One of the other benefits is that you’re not going to see just security information from this. With the amount of information you’re going to get, you’re going to see operational things that you hadn’t seen in the past. You’re going to see things that make you say, ‘Wow, we had no idea the system was operating that way, or those processes were running, or those jobs were taking so long to run.’ The feedback we get from our clients is that the value they get from the operational side of the SIEM is almost, if not just as much as, what they get from the security side. So just being able to see deeper into the environment and see what’s happening has been great for a lot of our clients as well.”

*Townsend Security’s Alliance LogAgent is a comprehensive, platform-specific solution for the IBM i that helps cut through the noise and deliver granular, valuable data, providing file integrity monitoring right down to field-level changes. These are key steps you want and need for compliance purposes as well as data security.

For a much deeper and more detailed discussion on secure system logging and monitoring as essential controls to detect and mitigate the risk of a data breach, please request a download of the entire webinar:

Learn the importance of system logging and monitoring


If these technologies are not in place, do you really know you haven't been hacked?


Topics: System Logging, File Integrity Monitoring (FIM), IBM i, Alliance LogAgent, Integrity

Understanding System Logging on the IBM i – Part 2 – Webinar Q&A

Posted by Michelle Larson on Jul 26, 2013 8:39:00 AM

As we discussed in the blog Understanding System Logging on the IBM i – Part 1, secure system logging on the IBM i (AS/400) can not only help you meet compliance requirements, it can help you stop a data breach before it happens! Hopefully you were able to watch the webinar resource provided (if not, you can request it HERE). After the webinar, we had a number of questions asked by attendees and answered by security expert Patrick Townsend.

Here is a recap of that Q&A session:

Q: Can you pick which group of users to audit?

Patrick: In our current version of Alliance LogAgent, our IBM i system logging solution, you can define a list of “excluded users.”  So you can actually tailor events on the IBM i platform and exclude particular user profiles.  We also provide some basic filtering capability allowing you to filter based on IBM event type and on User event type.  So yes, there is a fair amount of filtering capability in the product.  From a security posture, it would be a mistake to filter out significant security events, however the solution allows you to easily choose and select the events you want.  We have an option that maps straight to the security system values so you can easily set LogAgent to process those or you can establish and select any and all of the event types in the product that you want to monitor.
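Conceptually, the exclude-list and event-type selection Patrick describes work like the sketch below. This is an illustration only, not LogAgent's actual configuration; the journal entry types shown (PW, CP, AF, SV) are standard QAUDJRN entry types, while the user names are hypothetical.

    # Illustrative only: IBM i audit journal entries carry a two-character
    # entry type (e.g. "PW" for password failures, "CP" for profile changes).
    SELECTED_EVENT_TYPES = {"PW", "CP", "AF", "SV"}   # security-related types to forward
    EXCLUDED_USERS = {"BACKUPSVC", "MONITORSVC"}      # hypothetical service profiles

    def should_forward(event: dict) -> bool:
        """Keep an event only if its type is selected and its user is not excluded."""
        return (event.get("entry_type") in SELECTED_EVENT_TYPES
                and event.get("user") not in EXCLUDED_USERS)

    events = [
        {"entry_type": "PW", "user": "JSMITH"},
        {"entry_type": "CP", "user": "BACKUPSVC"},
    ]
    print([e for e in events if should_forward(e)])   # only the JSMITH event survives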

Q: With all the different IBM system events, is it possible to set up filtering of just the IBM security event types?

Patrick: Yes. In our solution, collection of the system security events is set as the default “out of the box” setting. You have total control over the events you collect, and with just a few keystrokes you can re-map your collection and filter in additional events, or leave it set to collect what IBM identifies as the security-type events in QAUDJRN.

Q: Can File Integrity Monitoring (FIM) also monitor read access to a database file? What about a user that comes in from another server doing just a select in SQL?

Patrick: The File Integrity Monitoring component in LogAgent is based on a database journal. A record-level read event is not collected in a database journal, but file open and close events are collected. So if a user accesses a file through a read and there is no update, insert, or add type of event, it may not appear in the journal. We capture all the information that the system makes available to us in the file integrity monitoring component on significant changes or access to any protected file.

Q: You mentioned File Integrity Monitoring. Can you further explain how an organization would use it?

Patrick: File Integrity Monitoring is designed to monitor configuration changes and highlight them for early detection. Generally, an attack on an application often involves changes to configuration files in an attempt to escalate privileges and give the attacker more authority or access than they should have. For example, if a new user is suddenly granted authority in the HR application to print checks, that is an important indicator of a potential problem! Another common use for File Integrity Monitoring is to monitor the use of stored sensitive information (credit card information or social security numbers) on your IBM i platform; you would want to use FIM on those databases or applications in order to strengthen your security stance and identify potential attacks early on.

Q: How do you keep QAUDJRN entries from flooding the network?

Patrick: With the large volume of events that can be collected by QAUDJRN, you can really minimize network impact by filtering out events that are not security related. You can still collect those events, but exclude them from being transmitted to your SIEM console. Because LogAgent works in real time, it helps reduce the impact on your network because you are not transmitting millions of events all at once. We also provide metering capabilities so you can reduce the number of events per second that are being transmitted.
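The metering idea can be pictured as a simple events-per-second budget. This is a conceptual sketch, not LogAgent's actual mechanism, and the limit of 200 events per second is an arbitrary example.

    import time
    from collections import deque

    class EventMeter:
        """Allow at most max_per_second forwarded events; the rest are deferred."""
        def __init__(self, max_per_second: int = 200):
            self.max_per_second = max_per_second
            self.sent_times = deque()

        def allow(self) -> bool:
            now = time.monotonic()
            while self.sent_times and now - self.sent_times[0] > 1.0:
                self.sent_times.popleft()        # forget sends older than one second
            if len(self.sent_times) < self.max_per_second:
                self.sent_times.append(now)
                return True
            return False                          # caller re-queues the event for later

    meter = EventMeter(max_per_second=200)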

Q: How do these logs get stored?

Patrick: When it comes to log storage, you want the logs off of your IBM i platform as quickly as possible to avoid potential tampering or corruption. With LogAgent, we do not make a copy of the events; we take the filtered events out of QAUDJRN in real time, format them, and transfer them to your log collection server or SIEM. Since some compliance regulations require that events be stored for a defined period of time, SIEM consoles compress, store, and protect those log events so you have the ability to do forensics, queries, and reporting on them.
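As a rough picture of real-time forwarding, the sketch below sends each filtered event off the box as a syslog message using Python's standard library. The collector host and port are placeholders, and a production transport would use TLS and whatever format your SIEM expects.

    import logging
    import logging.handlers

    # Illustrative forwarding of formatted security events to a remote collector.
    # "siem.example.com" and port 514 are placeholders for your log collection server.
    handler = logging.handlers.SysLogHandler(address=("siem.example.com", 514))
    logger = logging.getLogger("qaudjrn_forwarder")
    logger.setLevel(logging.INFO)
    logger.addHandler(handler)

    def forward(event: dict) -> None:
        """Send one filtered audit event off-box as soon as it is read."""
        logger.info("type=%s user=%s object=%s",
                    event.get("entry_type"), event.get("user"), event.get("object"))

    forward({"entry_type": "AF", "user": "JSMITH", "object": "PAYROLL/EMPMAST"})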

To learn more, view our webinar "Understanding Log Management on the IBM i" which examines the security principles, compliance requirements, and technical challenges for log collection and forwarding on the IBM i platform with the following learning objectives:

  • Security principles of log collection and monitoring
  • Compliance requirements of PCI DSS, HIPAA/HITECH, SOX, GLBA/FFIEC
  • Communicating with log collection and SIEM servers
  • File Integrity Monitoring and log collection
  • IBM i log collection challenges

DOWNLOAD WEBINAR Understanding System Logging  

If you have further questions, please list them here in the comment section and we will be sure to get you an answer!

Topics: System Logging, File Integrity Monitoring (FIM), IBM i, Alliance LogAgent

Understanding Log Management on the IBM i: Part 1

Posted by Michelle Larson on Jul 12, 2013 8:30:00 AM

Secure system logging on the IBM i (AS/400) can not only help you meet compliance requirements, it can help you stop a data breach before it happens! Intruders may start with a password hack that gives them access deeper into the system. There is usually a long trail, visible within system logs; everything from the original breach onward can be detected and identified with proper monitoring of those logs. What is really driving the need to collect and monitor system logs is how often breaches could have been detected with log management. For example:

  • Less than 1% of the breaches were discovered through active log analysis
  • Forensics showed 69% of these breaches were detectable via log evidence

Compliance regulations require (or strongly recommend) system logging. Do you know which of these apply to you and your company?

PCI Section 10 requires logging for anyone who collects credit card data

Requirement 10:  
 “Track and monitor all access to network resources and cardholder data”

    • 10.1 Establish a process for linking all access to system components (especially access done with administrative privileges such as root) to each individual user.
    • 10.2 Implement automated audit trails for all system components to reconstruct the following events:
    • 10.3 Record at least the following audit trail entries for all system components for each event:
    • 10.4 Using time-synchronization technology, synchronize all critical system clocks and times
    • 10.5 Secure audit trails so they cannot be altered.
    • 10.6 Review logs for all system components at least daily.
    • 10.7 Retain audit trail history for at least one year, with a minimum of three months immediately available for analysis.
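Requirement 10.3 above asks for specific detail in every audit trail entry: user, event type, date and time, success or failure, origination, and the affected resource. A small sketch of a log record carrying those fields, in no particular vendor's format:

    import json
    from datetime import datetime, timezone

    def audit_entry(user, event_type, success, origin, resource):
        """Capture the minimum fields PCI DSS 10.3 expects for each event."""
        return {
            "user": user,                                          # 10.3.1 user identification
            "event_type": event_type,                              # 10.3.2 type of event
            "timestamp": datetime.now(timezone.utc).isoformat(),   # 10.3.3 date and time
            "success": success,                                    # 10.3.4 success or failure
            "origin": origin,                                      # 10.3.5 origination of event
            "resource": resource,                                  # 10.3.6 affected data or resource
        }

    print(json.dumps(audit_entry("jsmith", "db_read", True, "10.1.2.3", "CARDHOLDER.PAN")))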

GLBA / FFIEC recommends data security logs of actions that could affect financial reporting or fraud for financial institutions.

    • Network and host activities typically are recorded on the host and sent across the network to a central logging facility.
    • The logging facility may process the logging data into a common format. That process is called normalization. Normalized data frequently enables timely and effective log analysis.

(This Link provides more information about FFIEC recommendations for logging)

HIPAA / HITECH ACT requires system logs of access to Protected Health Information (PHI) in the medical sector

    • LOG-IN MONITORING (A) - § 164.308(a)(5)(ii)(C)

…the covered entity must implement: “Procedures for monitoring log-in attempts and reporting discrepancies.”

    • Audit controls - § 164.312(b)

Standard: Audit controls. Implement hardware, software, and/or procedural mechanisms that record and examine activity in information systems that contain or use electronic PHI.

There are other compliance regulations and protocols that apply, but they all say about the same thing … you should be collecting system logs, you should be monitoring them, and you should take action based on anomalies that you find in them.  It is not enough to assert that you are doing the right thing; you have to be able to prove it with system logs that are independent from the original system files and verifiable.
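As one small example of what “monitoring log-in attempts and reporting discrepancies” can look like in practice, a monitor might flag a user who fails to sign on repeatedly within a short window. The threshold and window below are assumptions to be tuned to your environment, not values taken from any regulation:

    from collections import defaultdict
    from datetime import datetime, timedelta

    WINDOW = timedelta(minutes=10)
    THRESHOLD = 5                       # assumed threshold; tune to your environment

    failures = defaultdict(list)        # user -> timestamps of failed sign-ons

    def record_failed_login(user: str, when: datetime) -> None:
        """Track failures and report when a user exceeds the threshold within the window."""
        failures[user] = [t for t in failures[user] if when - t <= WINDOW]
        failures[user].append(when)
        if len(failures[user]) >= THRESHOLD:
            print(f"ALERT: {len(failures[user])} failed sign-ons for {user} within {WINDOW}")

    now = datetime.now()
    for i in range(5):
        record_failed_login("jsmith", now + timedelta(seconds=30 * i))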

System logging is important across all operating systems, but we are going to look at the IBM i in greater detail because of its complexity. Because the IBM i system can handle multiple applications, it doesn’t log information like other platforms do. The IBM i collects logs simultaneously from multiple sources and deals with large volumes: up to 3,500 events per second, or 250 million events per day! The essence of good reporting is externalizing the system logs and collecting them in a central repository, which helps remove the risk of tampering. Compliance regulations recognize the need to watch all users, including the most powerful users, because network-originated threats to the IBM i are often not noticed or quickly responded to by IT security professionals without close monitoring of system logs.

Creating the QAUDJRN (Security Audit Journal) on the IBM i

QAUDJRN is not created or enabled by default on the IBM i platform. If you have not set it up, you are not yet collecting system logs. To implement system logging you create the journal and journal receiver, then set system values that control what information is collected. Once the values are set, the collection process begins. QAUDJRN is non-modifiable and date-stamped, and a large amount of useful information can be collected for each event. However, just running system log reports on the security audit journal is not enough. Centralizing events and monitoring them off the IBM i platform is crucial. The events need to be consolidated and correlated in a separate location (usually a SIEM console) in order to see the whole picture and understand potential attacks on your system.

Take Away:
If you are properly collecting and monitoring your system logs, you can detect a breach before data is lost.

To delve deeper into this topic, we are sharing this newly recorded webinar in which security expert Patrick Townsend talks about system logging on the IBM i today and how the capabilities of Alliance LogAgent can provide you with a high-performance, affordable solution that will communicate system logs securely between your IBM i and Security Information and Event Management (SIEM) console.

DOWNLOAD WEBINAR Understanding System Logging

As always, we welcome your questions and comments posted here!

Topics: System Logging, HITECH, IBM i, Alliance LogAgent, HIPAA, PCI, GLBA/FFIEC

Three IBM i (AS400) Security Tips You Need to Know

Posted by Liz Townsend on Jul 3, 2013 9:35:00 AM

Over the past two years, IBM i 7.1 (V7R1) has come to be known as a powerful, reliable, and highly scalable solution for businesses. IBM i V7R1 supports total integration and virtualization with new encryption capabilities that are appealing to many companies who must comply with data security regulations such as PCI and GLBA/FFIEC. This new exit-point feature, called field procedures (FIELDPROC), helps businesses encrypt their sensitive data at the column level without any application changes in order to meet compliance regulations and protect data from hackers.

This is great news, since data breaches have become painfully common. Despite the staggering number of data breaches that happen every month, a new study has shown that nearly 70% of data breaches could have been avoided had the proper security measures been implemented.

Patrick Botz of Botz and Associates recently joined our founder and CEO, Patrick Townsend, in an interactive webinar that focused on security tips both he and Patrick recommend. Patrick Botz is an expert on data security and data breach prevention. He held the position of lead security architect at IBM and was the founder of the IBM Lab Services security consulting team.

Here are the top three security tips for users securing sensitive data in IBM i V7R1 and meeting data security regulations according to Patrick Botz and Patrick Townsend:

1. Use Encryption & Encryption Key Management Best Practices - Encryption is the tool that protects your data. If you do your encryption poorly, there’s really no point in doing it at all. In order to do encryption well, you must follow best practices for encrypting data and managing the encryption keys. These best practices include using AES encryption certified by the National Institute of Standards and Technology (NIST), key management validated to the FIPS 140-2 standard, and key management that utilizes controls such as separation of duties and dual control. Your encryption is only as good as your key management. If you follow best practices for encryption and encryption key management, you are also more likely to avoid having to report a data breach and deal with its severe costs.

2. Use Password Best Practices - Password management is often the downfall of companies who suffer a data breach, especially a breach that happens internally or by mistake. Patrick Botz specializes in password management and has enabled IBM i users to manage their passwords more securely with his Single Sign-On (SSO) service, SSO Stat! Using the Kerberos protocol, SSO works with both Windows and IBM i domains to streamline password use in a secured environment.

3. Monitor Your IBM i with System Logging - A crucial step to achieving good data security: receiving important system logs in real time and using a SIEM solution can help a database administrator prevent or catch a system breach as soon as it happens. System logging is also a critical part of meeting most compliance regulations. One challenge around system logging on the IBM i, however, is that the security audit journal, QAUDJRN, is in a proprietary IBM format. In order for these logs to be centralized and correlated with other logs in your server environment, they must be translated into a usable format. File integrity monitoring (FIM) is also important for monitoring configuration changes. Townsend Security’s Alliance LogAgent provides file integrity monitoring and translates all of your logs into a single usable format that can be read by your SIEM provider.
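As an illustration of what “a single usable format” means (the actual translation LogAgent performs is product-specific), the sketch below renders a parsed audit event as a Common Event Format (CEF) line, one of the formats many SIEMs accept. The vendor strings and field choices are assumptions:

    def to_cef(event: dict) -> str:
        """Render a parsed IBM i audit event as a CEF line for SIEM consumption.
        Vendor/product strings and field choices here are illustrative."""
        extension = " ".join([
            f"duser={event.get('user', '')}",
            f"src={event.get('source_ip', '')}",
            f"msg={event.get('detail', '')}",
        ])
        return ("CEF:0|ExampleVendor|IBMiAuditForwarder|1.0|"
                f"{event.get('entry_type', 'NA')}|{event.get('name', 'audit event')}|"
                f"{event.get('severity', 5)}|{extension}")

    print(to_cef({"entry_type": "PW", "name": "invalid sign-on", "severity": 7,
                  "user": "JSMITH", "source_ip": "10.1.2.3", "detail": "password failure"}))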

Encryption, encryption key management, password management, secure system logging, and file integrity monitoring are all absolute necessities for a business to safely store its data and avoid legal complications due to negligence.

Please check out our resources tab to find out more information. You can find us on Facebook, Twitter and LinkedIn as well as our website, www.townsendsecurity.com. Start better security today!

Podcast: Top IBM i Security Tips

Topics: Patrick Botz, IBM i

Three Most FAQs About Encryption Key Management on the IBM i

Posted by Michelle Larson on Jun 18, 2013 2:10:00 PM

The way organizations manage encryption keys is falling under more scrutiny from Payment Card Industry (PCI) Qualified Security Assessor (QSA) auditors. Companies must demonstrate they are enforcing dual control and separation of duties in order to protect sensitive data.

Here are the answers to three of our most frequently asked questions about encryption key management on the IBM i:

Is it still effective to use an integrated key management solution that stores encryption keys in the same partition as the encrypted data?  
The short and simple answer is No. There are many reasons why storing an encryption key on the same server that contains protected data is not advisable. This is not just an IBM i issue - it spans all of the current major operating systems. Let's explore this a bit more in the following sections.

How do IBM i users manage encryption keys according to PCI requirements with an encryption key manager?
The Payment Card Industry Data Security Standard (PCI DSS) states the following requirements for encryption key management:

  • Dual Control means that at least two people should be required to authenticate before performing critical key management tasks.

  • Separation of Duties means that the individuals managing encryption keys should not have access to protected data such as credit cards, and those that have access to protected data should not have the authority to manage encryption keys.

How are the “dual control” and “separation of duties” requirements achieved on IBM i?
On the IBM i you simply can't achieve these PCI requirements if you store the encryption key in the same partition as the protected data.  

The QSECOFR user profile (and any user profile with *ALLOBJ authority) will always have complete access to every asset on the system.  An *ALLOBJ user can circumvent controls by changing another user's password, replacing master keys and key encryption keys, changing and/or deleting system logs, managing validation lists, and directly accessing database files that contain encrypted data.

From the perspective of PCI, an integrated key management system puts too much control into the hands of any one individual. The only way to comply with PCI requirements for key management is to store the encryption keys off of the IBM i and take users with *ALLOBJ authority out of the picture completely. When you use a separate appliance to manage encryption keys, you can grant a user access to the protected data on the IBM i and deny that same user access to the key manager; now you have enforced separation of duties. And with the right key management appliance you can require two users to authenticate before keys can be managed, giving you dual control of encryption keys.
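Conceptually, dual control simply means a sensitive key management operation cannot proceed on one person's say-so. The approval check below is a toy illustration of that idea, not how any particular key manager implements it; the user names are hypothetical:

    AUTHORIZED_KEY_OFFICERS = {"alice", "bob", "carol"}   # hypothetical key custodians

    def dual_control_approved(approvers: set) -> bool:
        """Require at least two distinct, authorized people before a key operation runs."""
        valid = approvers & AUTHORIZED_KEY_OFFICERS
        return len(valid) >= 2

    def rotate_master_key(approvers: set) -> None:
        if not dual_control_approved(approvers):
            raise PermissionError("dual control not satisfied: two authorized approvers required")
        print("master key rotation authorized")   # the actual rotation happens in the key manager

    rotate_master_key({"alice", "bob"})            # proceeds only with two valid approvers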

Now it’s time to ask yourself a few questions!

  • Is your organization encrypting data on IBM i?  

    • If so, how are you managing the encryption keys?

  • If you store the keys on a separate partition, have you had a recent PCI audit?  

    • What did your auditor say?

If you aren’t sure of the answers, or if this still seems foreign to you, take a few minutes to download our eBook "Encryption Key Management Simplified”.

Whether you are an IT administrator or a business executive, this resource will help you learn the fundamentals of:

  • What is encryption key management

  • Key management best practices

  • How to meet compliance regulations (PCI-DSS, HIPAA/HITECH, GLBA/FFIEC, etc.) with encryption key management

  • How encryption key management works on every platform including Microsoft SQL Server '08/'12, Oracle, and IBM i

  As always, we welcome your comments and suggestions!  Let us know what you think of the eBook! 


Topics: Key Management, Separation of Duties, IBM i, Encryption Key Management, Dual Control

IBM i Security: FIELDPROC, Encryption Key Management, and Compliance

Posted by Liz Townsend on Apr 29, 2013 2:30:00 PM

In October of this year, IBM will end support for V5R4 of the IBM i. This decision will force customers running V5R4 to upgrade to either V6R1 or V7R1. Many customers are currently in the process of this upgrade or have already completed it. For IBM i administrators who have not yet begun this critical upgrade, it's important to know the differences between V6R1 and V7R1. The most notable difference is the new FIELDPROC capability offered exclusively in V7R1. Short for field procedure, FIELDPROC allows automatic, column-level encryption in the DB2 database without any program changes.

Patrick Townsend, CEO and Founder of Townsend Security, recently sat down with data privacy expert Patrick Botz at this year's COMMON exposition to discuss FIELDPROC, encryption key management, and what these changes mean for retail merchants who must comply with PCI-DSS. Here is an excerpt from that discussion:

Patrick Townsend: Patrick Botz, can you tell us why encrypting sensitive data is more important than ever, and how FIELDPROC can help IBM i customers easily encrypt sensitive data and meet compliance regulations?

Patrick Botz: I think encryption is something that we're realizing everyone should have been doing a long time ago. Today many businesses are required or advised to encrypt sensitive data by data security regulations such as PCI-DSS, HIPAA/HITECH, GLBA/FFIEC, and many state laws. This is evidence that encryption is extremely important today, not just from a security point of view, but from a compliance point of view. FIELDPROC is an excellent tool that IBM has added in V7R1 that makes it easier for ISVs to provide efficient and easy-to-use encryption without having to change programs. This is huge for customers. In fact, I've worked with at least two customer groups so far whose primary reason for upgrading to V7R1 is to be able to use products that use FIELDPROC.

Townsend: Jumping from V5R4 to V7R1 is a supported path, right?

Botz: Right!

Townsend: Patrick, I know that your company, Botz & Associates, does a lot to help IBM i customers with their security projects. Can you describe a typical encryption project and how FIELDPROC has saved them time, money, and aggravation in terms of getting the project done?

Botz: Yes, there is a pattern these projects tend to follow. Before they embark on an encryption project, the first discussion I have with IBM i customers is to answer questions such as: how many programs am I going to have to change, and how long is it going to take, because we can't afford to have our systems down. Then we start talking about the different products that take full advantage of FIELDPROC and how they won't have to change their programs to do encryption. Once we get to that point, customers are ready to jump in and they're excited! The next step is to discuss whether they want to encrypt just the fields with personally identifiable information (PII) or the whole database. From that point on it's a pretty easy process to get data encrypted.

I see many IBM i customers trying to do their own encryption, and one of the things I say to people is, "Have you heard the phrase 'it's not rocket science'? Well, with encryption, to make sure you get it right, it approaches rocket science." The fact is that customers really need to pick a solution that handles not only the encryption, but the key management as well. In my opinion the most important part of encryption is key management. I like to use the analogy of a padlock: if you buy the world's best padlock for your backyard shed, then pound a nail into the shed right next to the padlock and hang the key there, is that best padlock doing you any good?

In case you missed the presentation by Patrick Townsend and Patrick Botz, we recorded their session and have made it available for online listening. Download the podcast "FIELDPROC Encryption on the IBM i" to learn more about:

  • Encryption Key Management with FIELDPROC
  • The importance of certifications
  • What QSA and compliance auditors will look for in your key management system

Patrick Botz is an internationally known information security expert, specializing in security as a business requirement first and as technology second. His passion for SSO began while working at IBM, which he joined in 1989. He held several positions at IBM, including Lead Security Architect, and founded the IBM Lab Services security consulting practice. He architected the SSO solution for OS/400 and i5/OS, and he holds several security-oriented patents.

Topics: Encryption, IBM i, FIELDPROC

Help! Do I Upgrade My IBM i or My Software First?

Posted by Kristie Edwards on Apr 15, 2013 1:40:00 PM

Nearly every week I get asked the same question about IBM i (AS/400) upgrades by our current Alliance AES/400 and Alliance FTP Manager customers who are in the process of updating their operating system:

Top IBM i Security Tips

Q: “I am running Alliance FTP Manager with Commercial PGP.  Do I need to upgrade this to your latest version before I upgrade my IBM i to V7R1?”

A: Always upgrade the software you are currently running to the latest version first, then upgrade your operating system second.  IBM made changes in V6R1 and V7R1 and we have builds specific to these Operating Systems.  If you upgrade your software first to the most recent release, you will avoid larger issues around transferring your data when you upgrade to one of the newer IBM i versions.

Another related question I’m often asked is:

Q: “Should we do a system backup before a major OS update?” 

A: Yes, you should always do a full system backup before making changes to your system. System upgrades and other changes to your OS are often a challenge and can result in data loss, system crashes, and other major issues if you’re not careful. You never know what could happen, and you don’t want to be left piecing your system back together while your customers are waiting. Because it’s such a critical component we always remind our customers to backup their current application library. If something does go wrong during the upgrade, you’ll want the option to revert to the backup.

To help with these sorts of issues we provide a customer service portal with an extensive list of solutions and frequently asked questions. 24/7 support is often an important need for many customers upgrading their IBM i or other operating system, and we provide that as well. We know that planning a move like this takes lots of time and if we can help, we are happy to assist.

To learn more about IBM i security, check out our recent webinar, TOP 3 IBM i SECURITY TIPS FOR 2013, featuring data privacy experts Patrick Townsend and Patrick Botz.

Topics: IBM i