Townsend Security Data Privacy Blog

Patrick Townsend

Recent Posts

Encrypting & Protecting Medical Data – Some Thoughts Before HIMSS

Posted by Patrick Townsend on Feb 13, 2012 1:00:00 AM


Anyone who works with software applications in the medical segment is painfully aware of the complexity of patient information. You mix together a lot of personal information about patients, their families, their caregivers, diagnostic information, pharmaceuticals, and insurance providers, and you get a witches' brew of data that would make your head spin.

Mix in a rapidly changing regulatory environment and you’ve really got a headache!

Medical organizations and application vendors have a lot on their plates keeping up with all of this, and now with new Electronic Medical Record (EMR) requirements coming into effect, they have to become experts in encryption technologies to protect patient information.

The lights are blinking red; system overload!

We’ve been helping medical organizations meet their data protection requirements with our encryption and key management solutions for several years. Our commitment to industry certifications such as FIPS 140-2 fits well with HIPAA and HITECH Act guidelines on data protection. When you read about NIST recommendations for encryption and key management best practices, we are already there.

Software ISVs who serve the medical industry also need partner-friendly solutions. ISVs need more than just a technical solution. They need someone they can call on to explain data protection best practices, who can assist in the implementation of encryption and key management, and who can help them stay competitive in their markets. The last thing an ISV needs is to integrate some expensive technology into their solutions and then find themselves at a competitive disadvantage. I am proud of our partner program and its focus on making sure our partners are successful both in their technology initiatives and in their businesses.

This will be our first year at the HIMSS conference in Las Vegas, but we are bringing a lot of experience in the medical segment to the show.  I hope you find the show interesting and helpful, and that you come by our booth (#14124).


Topics: HITECH, HIPAA, Trade Shows

IBM i Encryption: Buy Solution or Use Built-In Libraries?

Posted by Patrick Townsend on Jan 10, 2012 8:03:00 AM

I’ve been writing about encryption performance lately because our customers and potential customers have been asking about the impact of encryption on the overall performance of their systems. It’s good that they are asking these questions, as a poorly performing encryption library can have a severe impact on your application environment. This is especially true on an IBM Enterprise platform like the IBM i (formerly known as AS/400 and iSeries), where customers often run multiple applications.

While it is common in the Microsoft, UNIX, and Linux worlds to segment different applications onto different physical servers, it is common in the IBM i world to run many applications on the same server. You typically find CRM, ERP, web, and many other applications happily co-existing on one IBM i server. But this means that a poorly performing encryption library will have a ripple impact on all of these applications, and not just one.

IBM provides a no-charge, AES software encryption library on the IBM i platform that developers can use to encrypt data. It implements all of the standard AES key sizes (128, 192, and 256) along with a variety of other encryption algorithms, both open and proprietary.  I don’t believe the software library has been independently certified to the NIST standards, but I believe that it properly implements the AES encryption algorithm.

But how does it perform?

We did a simple comparison test of encrypting 1 million credit card numbers on an entry-level IBM i model 515 server with a single processor. We compared the native IBM AES library with our own AES encryption library, which is NIST certified and optimized for performance. The difference is very large: our IBM i encryption library clocked in at 116 times faster than the native IBM i library. Note that this is an informal test and not independently verified, but the practical experience of our customers is very similar.
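
If you want to get a rough feel for this kind of number on your own data, a record-at-a-time timing loop is all it takes. Here is a minimal sketch in Python using the open source cryptography package; it is not our library or IBM's, just an illustration of how a throughput test like the one above can be structured, and the card numbers are generated fakes.

```python
# A rough throughput harness for AES-CBC encryption of many short records.
# Uses the third-party "cryptography" package (pip install cryptography);
# this is NOT the IBM i library or the vendor library discussed above --
# just a sketch of how to structure a record-by-record timing test.
import os
import time
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

RECORDS = 100_000                    # scale up to 1_000_000 for a longer run
key = os.urandom(32)                 # 256-bit AES key
records = [str(4000_0000_0000_0000 + i).encode() for i in range(RECORDS)]  # fake 16-digit PANs

start = time.perf_counter()
for pan in records:
    iv = os.urandom(16)              # unique IV per record
    enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    ciphertext = enc.update(pan) + enc.finalize()   # 16 bytes = one AES block, no padding needed
elapsed = time.perf_counter() - start
print(f"{RECORDS} records in {elapsed:.1f}s -> {RECORDS / elapsed:,.0f} records/sec")
```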

What does this mean in terms of application performance when you add encryption to the mix? The math is pretty simple. An encryption task that takes 10 minutes with our library will take roughly 19 hours with the IBM library. That’s painful. And all of the other applications that share this system will also feel the pain.

The problem is not limited to an occasional developer at an individual customer site. Some vendors of IBM i software use the IBM encryption libraries, too. So you may be using the poorly performing libraries without even knowing it.

Often I see IBM i customers trying to fix an encryption performance problem by adding additional processors to their servers. This can be expensive, and it usually involves software license upgrade fees. It also may not have the impact that you expect. Due to the way encryption workloads run, adding a second processor usually will not double your encryption throughput. Another bit of disappointment and extra cost.

It is usually not hard to fix an encryption performance problem if you catch it early. If you’ve taken a modular approach to the implementation, you can usually swap out one module for another without too much difficulty. You just don’t want to be doing that for hundreds of applications.

For more information on AES encryption, download our white paper "AES Encryption and Related Concepts" and learn about how proper encryption and key management work together to secure your data.

Patrick


Topics: Encryption, IBM i, Performance

Data Protection: Hashes and Salting

Posted by Patrick Townsend on Dec 29, 2011 10:00:00 AM

Periodically people ask me about hashes and why the use of a salt value with a hash is recommended. Let’s have a look at this topic in our last blog for 2011!

The use of a secure hashing algorithm is common in business applications. It has a variety of uses in the areas of authentication, data integrity, and tokenization. A hash method is sometimes called one-way encryption, but this is a bit of a misnomer.  It is true that you can’t reverse the result of a hash operation to recover the original value (thus it is one-way), but it is not formally an encryption method. This one-way property of hash methods is what makes them so useful. You don’t have to worry about sending a hash value across a network in the clear as it can’t be reversed. (At ease you crypto people, I know about the developing security concerns about SHA algorithms; more on that later).

While there are a number of hash algorithms available in the public domain, most security professionals recommend the use of the SHA-2 family of routines. I find that most people now use the SHA-256 algorithm when they want to create a one-way hash of some data, although the more secure SHA-512 method is being used more frequently. Older methods such as MD5 and proprietary hash methods should not be used in modern applications due to security concerns.  With SHA-256 and SHA-512 we have a really good method for doing one-way hashes.
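
For reference, here is what a basic SHA-256 or SHA-512 hash looks like in code. This is a minimal sketch using Python's standard hashlib module; the sample data is made up.

```python
# Minimal SHA-256 / SHA-512 hashing with Python's standard hashlib module.
import hashlib

data = b"patient-record-12345"
print(hashlib.sha256(data).hexdigest())   # 64 hex characters (256 bits)
print(hashlib.sha512(data).hexdigest())   # 128 hex characters (512 bits)
```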

So why do some security professionals recommend the use of a salt value with hashes, and what is salt?

The term salt refers to a one-off value that is difficult to guess. In practical application, a random number is generally used for a salt value. For the sake of this discussion, we will assume that a salt value is a random number.

By adding a salt value to some data before hashing it, you make it more difficult to guess the original value. Notice that I didn’t say you make it easier to reverse! For all practical purposes, you can’t reverse a hash value. But a clever attacker might guess at the original value and perform a dictionary or brute force attack on a hashed value. How can that be?

Well, take the example of your banking PIN code. It might be 4 or 5 digits in length. From the point of view of modern computers, that is a really small set of numbers to test against a SHA-256 algorithm. There are only 10,000 possible values for a 4-digit banking PIN code. It is going to take less than a second to run through all of the possibilities. So this is where a salt value can come in handy. If you are creating a hash value of a very small bit of data, you can append a salt value to the original data and make it really hard to attack that hash value. And that’s why using salt with your hashes is often a recommended security practice.
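
Here is a small sketch of that point in Python, standard library only and with a made-up PIN: brute forcing an unsalted SHA-256 hash of a 4-digit PIN finishes in well under a second, while a random 128-bit salt takes the easy guessing, and any precomputed tables, off the table. In a real password or PIN system you would also use a slow key derivation function such as PBKDF2 rather than a bare hash.

```python
# Why a bare hash of a 4-digit PIN is weak, and how a salt helps.
# Sketch only: the PIN is made up, and real systems should also use a
# slow KDF (PBKDF2, bcrypt, etc.) rather than a single fast hash.
import hashlib
import secrets

pin = "4831"
unsalted = hashlib.sha256(pin.encode()).hexdigest()

# Brute force: only 10,000 candidates, so this finishes almost instantly.
for guess in (f"{i:04d}" for i in range(10_000)):
    if hashlib.sha256(guess.encode()).hexdigest() == unsalted:
        print("recovered PIN:", guess)
        break

# With a 128-bit random salt the attacker must also know the salt, and
# precomputed tables of hashed PINs become useless.
salt = secrets.token_bytes(16)            # 128-bit salt; protect it like a key
salted = hashlib.sha256(salt + pin.encode()).hexdigest()
```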

By the way, even though credit card numbers are only 16 digits in most cases, that is still a small number in computational terms. And once you account for BIN codes and Luhn check digits, credit card numbers are effectively smaller than 16 digits. This is why PCI and other regulations require or recommend the use of salt with hashes.
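
To make the "effectively smaller than 16 digits" point concrete: the Luhn check digit is completely determined by the other digits, so an attacker never has to guess it. A small sketch (the sample payload is made up):

```python
# Luhn check digit: the last digit of a card number is computed from the
# preceding digits, so an attacker only has to guess the non-BIN,
# non-check-digit portion of the number.
def luhn_check_digit(partial: str) -> str:
    total = 0
    # Walk the payload right to left, doubling every second digit.
    for i, ch in enumerate(reversed(partial)):
        d = int(ch)
        if i % 2 == 0:          # rightmost payload digit gets doubled
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return str((10 - total % 10) % 10)

print(luhn_check_digit("453201511283036"))   # -> "6"
```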

If you do use a salt value with a hash, you have to take care to protect the salt value from loss. You should take as much care protecting the salt value as you take with your encryption keys. If someone knows the salt value, you’ve lost your advantage. Also, be sure to use a salt value that is large enough to provide good security; a 128-bit salt value is adequate for most business applications.

As I hinted at above, there have been some developments in attacks against the SHA-2 family of hash algorithms. I don’t think these attacks rise to the level of a practical concern in business applications, but the professional cryptographic community is hard at work on new hash methods. I think you should continue to use SHA-256 with confidence, but you should salt that hash for added protection!

Happy Holidays!

Patrick

Be sure to follow us on Facebook, Twitter, and LinkedIn to stay up to date on the latest technology and news about data protection.


Topics: security, Data Privacy, SHA-256

Stalled - Encryption of Data at Rest

Posted by Patrick Townsend on Dec 13, 2011 7:35:00 AM

A number of studies show that only about 25 percent of companies and organizations have deployed encryption of data at rest to meet privacy regulations, and we seem to be stalled at about that level. We are now about 10 years past the really big data losses that led to the emphasis on protecting data. Why are we making so little progress?

I think one of the main reasons is the level of difficulty in deploying most data encryption solutions. Most organizations still see an encryption project as requiring lots of time, money, and human resources to accomplish. As humans I think we all have a tendency to avoid the hard and painful things we know we need to do (I plead guilty). And this is an impediment to getting our data protected with the right encryption and key management technologies.

Vendors of data protection technologies have been slow to address this part of the equation. We have our heads in the technical side of things trying to be sure that we implement secure solutions that meet best practices, and working towards the difficult product certifications that we have to accomplish. The user experience is not usually the thing most on our minds. So, I think we’ve been a part of the problem.

It is also true that developers who are good at the user experience are generally lousy at security. You just don’t go about security development in the same way you mash up a new web service. Most of the new web-based security solutions that promise to make things so much easier look, from the outside, really weak in terms of encryption, key management, and interface security.

It is up to those of us who make security solutions to make them easier to use. Here at Townsend Security we are trying to channel Steve Jobs’ focus on the user experience. Once you have the foundational security applications done and certified, it is time to look at how to make them easier to use. This year we implemented our SQL Server EKM encryption key management solution, which makes it easy to secure Microsoft data. We also introduced IBM i FIELDPROC automatic encryption, which is making data protection a lot easier for AS/400 customers. I am convinced we are on the right track in this regard, and you will find us trying to make other environments easier to secure as we go forward.

Best wishes for the New Year!

Patrick Townsend

Learn how we have made encryption key management easier and more affordable than ever with Alliance Key Manager.


Topics: Encryption, Encryption Key Management

FIELDPROC – One Place Encryption Performance Really Matters

Posted by Patrick Townsend on Nov 21, 2011 11:00:00 AM

IBM introduced FIELDPROC (Field Procedures) in V7R1 of the IBM i (AS/400, iSeries) operating system to provide an automatic method of implementing encryption at the column level. While new to the IBM i platform, FIELDPROC is not actually a new technology. It was first implemented on the IBM System z mainframe platform about 20 years ago. But it is new to the IBM i and is now starting to get a lot of attention as customers start the upgrade process to V7R1.

The attraction of FIELDPROC is that it gives you a way to implement AES encryption on the IBM i without changing your application code. As long as you have an application that can perform key retrieval and encryption (IBM does not supply this) you are ready to implement FIELDPROC. 

But you should be aware of the one really big impact of FIELDPROC on your application: performance. A FIELDPROC program is called dynamically from the DB2 database engine. That is, it is not statically bound to the database, and it is not incorporated as a service program (dynamically linked library). The dynamic nature of the FIELDPROC invocation, added on top of the encryption CPU load, can lead to really bad surprises when you roll into production.

Before you deploy your own or your vendor’s FIELDPROC code, do some simple tests. I suggest that you do these simple tests on a database of 1 million records:

  • Start FIELDPROC to place the entire table under encryption control.
  • Read the entire database to force a decryption on every record.
  • Update the encrypted field in every record to force a decryption and encryption for every record.

If you have multiple fields in a table under FIELDPROC control, you will want to do additional performance tests as well. If you encrypt 20 fields in the table, what will happen when FIELDPROC gets called 20 times with every database read?
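
The three tests above really boil down to timing a full-table read pass and a full-table update pass. Here is a rough, generic harness in Python; it uses sqlite3 purely so the sketch runs anywhere, and the table and column names are made up. In practice you would point the same pattern at your FIELDPROC-enabled DB2 table through whatever IBM i database driver you use.

```python
# Generic timing harness for the read and update passes described above.
# sqlite3 is used only so this sketch runs anywhere; substitute your own
# DB-API connection and FIELDPROC-enabled table on the IBM i in practice.
import sqlite3
import time

ROWS = 100_000   # use 1_000_000 to match the test described in the post

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, cardno TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?)",
    ((i, f"{4000000000000000 + i}") for i in range(ROWS)),
)
conn.commit()

start = time.perf_counter()
for _ in conn.execute("SELECT id, cardno FROM customers"):
    pass                                 # read every row (forces a decrypt under FIELDPROC)
print(f"read:   {ROWS / (time.perf_counter() - start):,.0f} rows/sec")

start = time.perf_counter()
conn.execute("UPDATE customers SET cardno = cardno")   # rewrite every row (decrypt + encrypt)
conn.commit()
print(f"update: {ROWS / (time.perf_counter() - start):,.0f} rows/sec")
```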

We are a vendor of a FIELDPROC solution, and I will share some results with you from one of our in-house systems. To line up with compliance regulations and encryption best practices, we used our FIPS 140-2 certified encryption key management appliance and our NIST certified AES encryption library. These results are not independently verified, but you can download the tests and try them on your own system (always a good idea).

The Platform:

An entry level 9407 model 515 with a single POWER5+ processor, 1 Gigabyte of memory, two 70-Gigabyte model 4327 disk drives (no RAID), and a CPW rating of 3800. The latest V7R1 cumulative PTFs are installed. This is the slowest thing we have in the house.

The Database:

A simple, uniquely keyed DB2 database created with DDS and containing 5 character fields and one packed numeric field. One of the non-keyed character fields is encrypted with FIELDPROC. The file contains 1 million records.

Encryption Key Management:

Our FIPS-140-2 NIST certified Alliance Key Manager encryption key server installed on the local network. Our FIELDPROC application will automatically and securely retrieve the encryption key when needed.

Encryption Library:

Our NIST certified, optimized, 256-bit AES encryption software library.

The Application Environment:

No other applications running on the system at the same time; the system is in normal state (not dedicated); all applications are OPM model with no optimization; tests are run in batch.

The Results:

Start FIELDPROC to place the database under initial protection:

Elapsed time:  68 seconds
Records per second:  14,705
Application CPU:  34.33

Read all records to force a decryption:

Elapsed time:  62 seconds
Records per second:  16,129
Application CPU:  37.43

Update all records to force a decryption, an encryption, then an update:

Elapsed time:  88 seconds
Records per second:  11,363
Application CPU:  81.26

I think this is a pretty good baseline of the minimum performance our customers will see with our FIELDPROC solution. Most of our customers run the more modern POWER6 or POWER7 processors, which bring a lot more CPW power to the task (a new entry-level POWER7 processor has 10 times the CPW rating). More and faster disk drives and more memory will definitely help performance. So you should see substantially better performance in real-world environments.

I hope this provides some helpful guidelines for your FIELDPROC project. Download an evaluation copy of our Alliance AES Encryption for FIELDPROC to see for yourself just how easily you can protect your sensitive data.


Topics: Encryption, IBM i, FIELDPROC

Encrypted PDF & ZIP with Managed File Transfer

Posted by Patrick Townsend on Nov 4, 2011 8:22:00 AM

IBM i (AS/400, iSeries) users send their customers, vendors, and employees a lot of sensitive information that needs to be protected with strong encryption. Our customers today are using our PGP encryption solution to protect files. But there has been a big need to generate and protect information in common PC formats. With our managed file transfer solution, Alliance FTP Manager for IBM i, we stepped up our support with encrypted Zip files and encrypted PDF files.

Zip compression is very commonly used to send files via email. Not only does Zip compression make our email attachments smaller, but the most popular Zip compression programs now support 256-bit AES encryption of the contents. The ability to encrypt Zip files with AES provides a much better level of security than older Zip protection methods. Alliance FTP Manager for IBM i fully supports Zip encryption to the WinZip standard. This means that you can create and protect Zip files on your IBM i platform, and then use a variety of delivery methods to get the Zip files into the hands of your customers, vendors, and employees. This functionality gives IBM i customers a powerful tool to meet compliance regulations.
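
For readers who want to see what WinZip-style AES Zip encryption looks like in general, here is a minimal sketch using the third-party pyzipper Python package; the file names and passphrase are made up, and this is not the Alliance FTP Manager implementation.

```python
# WinZip-compatible AES-256 Zip encryption using the third-party
# "pyzipper" package (pip install pyzipper). File names and passphrase
# are made up; this is a generic sketch, not the Alliance implementation.
import pyzipper

with pyzipper.AESZipFile("payroll.zip", "w",
                         compression=pyzipper.ZIP_DEFLATED,
                         encryption=pyzipper.WZ_AES) as zf:
    zf.setpassword(b"a-long-random-passphrase")
    zf.writestr("payroll.csv", "emp_id,amount\n1001,2500.00\n")

# Reading the archive back requires the same passphrase.
with pyzipper.AESZipFile("payroll.zip") as zf:
    zf.setpassword(b"a-long-random-passphrase")
    print(zf.read("payroll.csv").decode())
```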

Encrypted Zip support in Alliance FTP Manager provides rich capabilities to IBM i users. You can create encrypted or un-encrypted Zip archives, include sub-directories, and use wild cards to select files.  When uncompressing and decrypting, you can specify any directory as the target for the files. This capability integrates with our automation facilities for processing received files. Lastly, we provide a Windows command line Zip application to help our customers who don’t already have a Zip application.  I’m confident that this capability will help customers achieve a better level of security.

Another security technology in FTP Manager for IBM i is our encrypted PDF support. In this implementation, our customers are able to create encrypted PDFs with their own content, and then use the automation facilities to distribute the PDFs via email, FTP, and other distribution methods. Encrypted PDF support includes the ability to set fonts and colors, embed watermarks and graphic images, set headers and footers, and create tables and lists. The resulting encrypted PDF file is compatible with any PDF reader that supports the AES encryption standard for PDF. We’ve tested with a wide variety of PDF readers on PCs, Apple Macs, BlackBerry devices, Linux desktops, and so forth. This gives our customers an additional tool to secure their sensitive data.
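
As a generic illustration of AES-protected PDF output (not the FTP Manager feature itself), here is a short sketch using the third-party pikepdf Python package; the file name and passwords are made up, and the R=6 setting is assumed to select the AES-256 encryption handler.

```python
# AES-encrypted PDF output using the third-party "pikepdf" package
# (pip install pikepdf). A generic sketch, not the FTP Manager feature;
# R=6 selects the AES-256 encryption handler. Names/passwords are made up.
import pikepdf

pdf = pikepdf.new()
pdf.add_blank_page(page_size=(612, 792))     # stand-in for real report content
pdf.save(
    "statement.pdf",
    encryption=pikepdf.Encryption(owner="owner-pass", user="reader-pass", R=6),
)
```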

These technologies increase IBM i customers’ ability to meet compliance regulations and secure sensitive data. I hope you get the idea that we are dedicated to helping you protect your sensitive data and corporate assets. You are going to see a lot more of these types of capabilities as we go forward. For more information on our managed file transfer solution, view our webcast "Secure Managed File Transfers on the IBM i."



Topics: Alliance FTP Manager, Managed File Transfer, Secure Managed File Transfer, ZIP, FTP Manager for IBM i, secure communications, Webinar

IBM i FIELDPROC Surprises

Posted by Patrick Townsend on Nov 1, 2011 8:12:00 AM

In V7R1 of the IBM i (iSeries, AS/400) operating system, IBM introduced support for automatic, column-level encryption in the DB2 database called FIELDPROC (short for “Field Procedure”). For customers who are familiar with other automatic database encryption implementations, such as Microsoft SQL Server Extensible Key Management (EKM) and Oracle Transparent Data Encryption (TDE), the new DB2 database implementation can be confusing. The encryption implementation in DB2 is quite different from other vendor implementations. Here are a few highlights of those differences:

FIELDPROC is not encryption; it is an exit point. What IBM is providing with FIELDPROC is the opportunity for you, the customer, to implement encryption at the column level. You have to enable the option, provide the IBM i AES encryption software, and do the implementation yourself. This is different from SQL Server and Oracle TDE, where the database does the encryption for you.

Operations are on encrypted data. In SQL Server and in Oracle, the data is decrypted and then the SQL operations take place on the plaintext values. Not so in DB2 FIELDPROC. The necessary information is first encrypted, and the operation takes place using the encrypted value. This can lead to some surprises if you are not careful about your approach to AES encryption and Initialization Vectors.

Decryption might not be called on a read operation. Some IBM i customers are surprised that FIELDPROC may not be called when doing a “read equal” type of operation. In SQL this can happen with a SELECT statement that has a WHERE clause. In the RPG language it can be a CHAIN operation with a key value. The DB2 database will call the FIELDPROC program to encrypt the search value, but it may not call the FIELDPROC program again when the search is satisfied. That will defeat your attempt to do data masking on decryption!

Database joins need special care. Database joins take place on the encrypted values, not the decrypted values. This means that identical values in different tables need to have the same encrypted value. This runs counter to normal encryption thinking in database tables.
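
The Initialization Vector point in the two items above is worth seeing in code. With a random IV, the same plaintext produces different ciphertext every time, so encrypted-value comparisons and joins cannot match; a deterministic approach (illustrated here with a fixed IV) is what makes them work, at some cost in cryptographic strength. This is a sketch using the Python cryptography package, not an IBM i library, and the card number is a made-up test value.

```python
# Why IV handling matters under FIELDPROC: with a random IV the same
# plaintext encrypts to different ciphertext each time, so encrypted-value
# comparisons (keyed reads, joins) fail to match. A fixed IV per column
# makes the ciphertext deterministic, which is what comparisons on
# encrypted data effectively require. Sketch only, not an IBM i library.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(32)
pan = b"4532015112830366"                     # made-up value; 16 bytes = one AES block

def encrypt(value: bytes, iv: bytes) -> bytes:
    enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    return enc.update(value) + enc.finalize()

print(encrypt(pan, os.urandom(16)) == encrypt(pan, os.urandom(16)))   # False: random IVs
fixed_iv = bytes(16)
print(encrypt(pan, fixed_iv) == encrypt(pan, fixed_iv))               # True: deterministic
```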

FIELDPROC applications are dynamic calls. The FIELDPROC applications that you or your vendor create are going to be called dynamically from the database engine. This means that when you develop a FIELDPROC application you have to take special care that it performs exceptionally well, and that your encryption library is optimized (see the next item).

Not all AES encryption libraries are equal. There are big performance differences between AES encryption libraries and it can mean a really big difference to your FIELDPROC application performance. We’ve noted before that AES encryption library performance can vary by a factor of 116. That difference can mean a batch job that takes 10 hours or 10 minutes. Be careful!

For further information, view our webinar "Automatic Encryption on IBM i V7R1" and learn how automatic encryption is now possible on IBM i V7R1 with AES/400.

Patrick


Topics: IBM i, automatic encryption, V7R1, AES Encryption

Ouch! – I Guess Encryption Standards Actually Do Matter

Posted by Patrick Townsend on Oct 25, 2011 8:17:00 AM


The recent news that SAIC suffered a data loss and is being dinged for not protecting US military TRICARE medical information with standard AES encryption is interesting. While the details are still thin, it appears that the data was encrypted, but not with a standard AES encryption method. The HITECH Act's proposed data security rules make specific reference to AES and other NIST standards.

We don’t know which encryption method was used to protect the data. It could have been a home grown method of encryption, or it may have been a widely accepted encryption method that was just not a part of NIST standards. But it apparently doesn’t matter. If you are not using a NIST standard method of encryption, you are in violation of the compliance requirements.

I think it is going to take some time for the implications of this to settle in. Here are some rather unorganized thoughts:

Over the last two years I’ve seen at least FOUR instances of vendor “AES” encryption solutions that actually weren’t AES encryption. In one case, a point-of-sale vendor implemented an AES encryption library with a 256-bit AES block size. The AES standard (FIPS-197) only allows the use of a 128-bit block size.  The company running this software had no idea that they weren’t actually running an industry standard method of encryption.
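
This is easy to verify for yourself: FIPS-197 fixes the AES block size at 128 bits no matter which key length you use; only the key size varies. A quick check using the Python cryptography package:

```python
# FIPS-197 AES has a fixed 128-bit block size for all three key lengths;
# only the key size varies. Quick check using the "cryptography" package.
import os
from cryptography.hazmat.primitives.ciphers import algorithms

for key_bits in (128, 192, 256):
    algo = algorithms.AES(os.urandom(key_bits // 8))
    print(key_bits, "bit key ->", algo.block_size, "bit block")   # always 128
```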

In another case a customer was running AES encryption with a non-approved mode of encryption. The underlying encryption library was AES, but the mode was not a NIST-approved mode of operation. This was a distinction lost on the company running this “AES” solution. But it seems likely to me that they were out of compliance and at risk in the same way SAIC was. This company is going to have to rip out the current solution and replace it with something that is actually compliant. That seems like such a waste of time and resources.

In one of these cases the software was provided by a “security” vendor. This vendor sells encryption and key management software specifically to meet encryption compliance regulations. That’s very sad.

With the best of intentions and with deep knowledge of encryption protocols, you can still make mistakes when developing an encryption solution. It is hard to get this right. And weak vendors without the commitment and passion to get it right represent a risk to everyone. So, if you are a vendor of encryption solutions, what do you do to ensure that you are getting things right? You learn to not trust yourself so much, you invest in independent review of your solutions, and you invest in independent certification. Today we would never release an encryption product without subjecting it to NIST certification and independent review.

If you are a company facing an encryption project, how will you select a security vendor for your encryption libraries and encryption key management solution? How will you know that their AES encryption is really based on the NIST standard? Are you ready to trust the claims of a sales person? I wouldn’t, and I don’t think you should, either. If a security vendor can’t show you a formal NIST AES Validation certificate, or a FIPS-140-2 certification, you should run for the nearest exit. You just have way too much to lose.

If you think that the HITECH Act is unique in its reference to NIST standards, have a look at the proposed Federal Privacy Law (Senate Bill 1151) that passed out of the Senate Judiciary committee last week. It is likely to empower the FTC to propose standards for encryption and encryption key management, and the FTC is likely to look to NIST for these standards.

The writing is on the wall, or rather, it’s on the Internet at www.nist.gov.

Learn more about proper encryption and key management best practices for HIPAA and HITECH Act in our white paper titled "Achieve Safe-Harbor Status from HITECH Act Breach Notification".

Patrick


Topics: Encryption, NIST, HITECH, HIPAA, AES

PCI DSS Losing Ground?

Posted by Patrick Townsend on Oct 5, 2011 3:34:00 PM

The recent 2011 PCI Compliance Report released by Verizon concludes that many companies are losing ground on PCI DSS compliance and that 44 percent of all breaches take over a year to be discovered. These findings are disturbing. eWeek.com wrote an excellent summary of the report.

Here is one snippet from the article:

"About 42 percent of organizations had trouble encrypting data in the database or implementing a proper key management strategy to keep the information safe."

We know that data protection is the hardest part of PCI DSS compliance. Many studies show that organizations struggle with encryption and key management. But are they really losing ground after they get their data protection in place?

I talk to a lot of customers about PCI DSS compliance, and I have a different take on this.

The recent audit and training changes that affect Level 2 Merchants may be showing up in this statistic. Prior to 2011, Level 2 Merchants completed an annual Self Assessment Questionnaire (SAQ). Starting in 2011 Level 2 Merchants must either undergo an on-site audit by a QSA auditor, or send a member of their IT team for ISA training by the PCI council. A lot of companies are opting for the second option and are getting their internal staff through the ISA training process.

I think that a lot of these newly trained IT professionals are coming back home and understanding encryption and key management requirements a lot better. It was easy to put check marks in the boxes when completing the SAQ. Now there is a lot more thought about what good encryption and key management mean. I think that is driving a lot of the change, especially in the area of key management.

Did these companies lose ground? No, they weren’t in compliance before, and they are just coming into compliance now.

Customers tell me that meeting the PCI DSS requirements for key management is their biggest area of remediation. They’ve been storing encryption keys in a file, or somewhere on the hard drive, or on a USB storage device, or on another server where they are not properly protected. None of these techniques can meet PCI DSS requirements for Dual Control, Separation of Duties, and Split Knowledge. Really, any storage of data encryption keys on the same server as protected data is going to be a compliance problem. Newly trained IT staff now understand this and are taking action to fix the problem.

So, did they fall out of compliance? No, they weren’t in compliance before and now they are moving towards better security. And that is a good thing.

I don’t mean to minimize the effort that it takes to stay in compliance with PCI DSS. It’s a lot of work and it takes on-going attention. And security and IT departments are under the same budgetary pressures that all of us feel. They are trying to make do with fewer people and smaller budgets. 

But perhaps the news is not as bad as we think. If you haven’t taken a look at your key management strategy lately, now is the time to do it.

For more information, download our podcast "Key Management Best Practices: What New PCI Regulations Say" and learn about encryption key management best practices, as well as what PCI has to say about integrated key management (and why it isn't a good thing), dual control, separation of duties, and split knowledge.


Patrick


Topics: Compliance, PCI DSS, Encryption Key Management

Federal Data Privacy Law Advances in Senate Bill 1151

Posted by Patrick Townsend on Sep 29, 2011 10:35:00 AM

Draft versions of a Federal data privacy and breach notification law have been in existence for over a year. The House of Representatives passed a version some months ago, and two versions have been working their way through the US Senate. This week saw a significant advance in the US Senate as the Judiciary Committee, under Senator Patrick Leahy's leadership, passed a version out of committee with a vote along party lines. I think Senate Bill 1151 represents a significant step toward a federal law that will replace all of the approximately 45 state laws on breach notification. The law still has to be reconciled with the House version, and a lot can change in the process, but there is general agreement in the business community that one Federal law is preferable to a lot of different state laws. So I think there is a good chance that a Federal privacy law can pass.

Here is a recap of some of the features of the new law that will affect your business:

  • You will need to have a written security policy.
  • You will need to perform periodic vulnerability assessments.
  • You will need to protect data using industry standard practices such as encryption.
  • The legal penalties include fines and imprisonment.
  • If you share sensitive data with service providers, you must ensure that they protect the data.
  • You are responsible for notifying people affected by the data loss.
  • There is an expanded definition of “Sensitive Personally Identifiable Information”.
  • You will need to maintain audit trails of who accessed sensitive information.

In many ways, the new federal law goes further than most state laws in defining what companies must do to protect sensitive data. The law tries to strike a balance between prescriptive measures and the evolving nature of threats. In many respects the law comes close to adopting many of the principles of the Payment Card Industry Data Security Standard (PCI DSS), and companies that meet PCI DSS standards will find a lot that is familiar in the law.

The definition of Personally Identifiable Information (PII) has expanded pretty dramatically and now includes telephone numbers and mobile device IDs, email addresses, and other information. I will talk about this a bit more in future blogs. I think there are some substantial procedural and technology issues in this area that will affect your approach to protecting data.

As I expected, the Federal law makes reference to industry standards for encryption and key management, and points directly to existing laws such as Gramm-Leach-Bliley (GLBA), the Health Insurance Portability and Accountability Act (HIPAA), and others. The Federal Trade Commission is charged with developing guidelines in this area. I think there is a well-worn template for this type of work that will point directly to the NIST standards and best practices. I believe that companies would do well to be sure that their data protection strategies line up with NIST standards. FIPS 140-2 certification is already required of some private enterprises, and this is probably the direction we are going.

Be sure to follow us on Facebook, Twitter, and LinkedIn to stay up to date on the latest technology and news about data protection.


Topics: privacy laws, Data Privacy