Townsend Security Data Privacy Blog

How Tokenization Can Help Your Business

Posted by Luke Probasco on Mar 8, 2012 7:50:00 AM

White Paper: Business Case for Tokenization


Download the white paper "The Business Case for Tokenization"


Tokenizing sensitive data delivers an outstanding return on investment (ROI) by reducing the risk of losing sensitive information – credit card numbers, social security numbers, banking information, and other types of Personally Identifiable Information (PII).  In some cases tokenization can even take an entire server or database application out of scope for compliance regulations.  This blog will discuss how tokenization can reduce risk in customer service departments, with outside services, and in business intelligence (BI) and query environments.

Tokenization for Customer Service

Tokenization can reduce risk in the customer service department by removing sensitive data from customer service databases.  For outsourced operations, tokenize data before sending it to the outside service.  A customer service representative can still accept real information on the phone from a customer, but there is no need to store the actual information in a database that can be stolen.  The tokenization service associates the real information with the token for later retrieval.  Using tokenization in a customer service environment can’t completely remove the risk of data loss, but it can dramatically reduce the amount of data at risk and help you identify potential problems.

Tokenization for Outside Services

Many retail companies send their Point-Of-Sale transaction information to analytics service providers for trend and business analysis.  The service provider identifies trends, spots potential problems with supply chains, and helps evaluate the effectiveness of promotions.  In some cases, service providers consolidate information from a large number of companies to provide global trends and analysis.  You can avoid the risk of data loss by replacing the sensitive data (names, credit card numbers, addresses, etc.) with tokens before sending the data to the service provider.

Tokenization for Business Intelligence and Query

Many IT departments help their business users analyze data by providing them with business intelligence (BI) and query reporting tools, along with databases of historical information. These tools and databases have empowered end-users to create their own reports, analyze business trends, and take more responsibility for the business.  This practice has decreased workloads and increased efficiency in IT departments.

Unfortunately, these tools and databases open a new point of loss for sensitive information.  A database with years of historical information about customers, suppliers, or employees is a high value target for data thieves.  Criminals aggregate this type of information to build a complete profile of an individual, making it easier to steal their identity.  Replacing names, addresses, and social security numbers with tokens makes the BI database unusable for identity theft while maintaining the relational integrity of the data.  Tokenizing business intelligence data is an easy win to reduce your risk of exposure.

Download our white paper “The Business Case for Tokenization: Reducing the Risk of Data Loss” to see how tokenization is helping organizations meet their business goals without exposing their sensitive data to loss. 


Topics: Data Privacy, tokenization

The Business Case for Tokenization

Posted by Luke Probasco on Feb 28, 2012 11:44:00 AM

White Paper: Business Case for Tokenization


Download the white paper "The Business Case for Tokenization"


Tokenization is a technology that helps reduce the chance of losing sensitive data – credit card numbers, social security numbers, banking information, and other types of Personally Identifiable Information (PII). Tokenization accomplishes this by replacing a real value with a made-up value that has the same characteristics.  The made-up value, or “token”, has no relationship to the original person or value, and thus has no value if it is lost to data thieves.  As long as a token cannot be used to recover the original value, it works well to protect sensitive data.

Tokenization in Development and QA Environments

Tokenization is an excellent method of providing developers and testers with data that meets their requirements for data format and consistency, without exposing real information to loss.  Real values are replaced with tokens before being moved to a development system, and the relationships between databases are maintained.  Unlike encryption, tokens maintain the data types and lengths required by the database applications.  For example, a real credit card number might be replaced with a token such as 7132498712980140.  The token has the same length and characteristics as the original value, and that token value will be the same in every table.  By tokenizing development and QA data you remove the risk of loss from these systems, and remove suspicion from your development and QA teams in the event of a data loss.
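To make the idea concrete, here is a minimal sketch of a vault-style, format-preserving token generator for test data. It is illustrative only and is not the Alliance Token Manager implementation; a real solution would keep the vault in a hardened, encrypted database under the appropriate compliance controls.

```python
# Minimal sketch of vault-style tokenization for test data (illustrative only).
import secrets

class TokenVault:
    def __init__(self):
        self._token_by_value = {}   # real value -> token
        self._value_by_token = {}   # token -> real value

    def tokenize(self, value: str) -> str:
        """Return a token with the same length and character classes as `value`.
        The same input always yields the same token, so joins between tables
        on the tokenized column still line up."""
        if value in self._token_by_value:
            return self._token_by_value[value]
        while True:
            # Preserve format: digits stay digits, letters stay letters.
            token = "".join(
                secrets.choice("0123456789") if ch.isdigit()
                else secrets.choice("ABCDEFGHIJKLMNOPQRSTUVWXYZ") if ch.isalpha()
                else ch
                for ch in value
            )
            if token not in self._value_by_token and token != value:
                break
        self._token_by_value[value] = token
        self._value_by_token[token] = value
        return token

vault = TokenVault()
card = "4111111111111111"
assert vault.tokenize(card) == vault.tokenize(card)   # consistent across tables
assert len(vault.tokenize(card)) == len(card)         # same length and data type
```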

Tokenization for Historical Data

In many companies, sensitive data is stored in production databases where it is not actually needed.  For example, we tend to keep historical information so that we can analyze trends and understand our business better.  Tokenizing sensitive data, in this case, provides a real reduction in the risk of loss.  In many cases it may take an entire server or database application out of scope for compliance regulations.  At one large US company, tokenization removed over 80 percent of the servers and business applications from compliance review, reducing both the risk of data loss and the cost of compliance audits.

Download our white paper “The Business Case for Tokenization: Reducing the Risk of Data Loss” to see how tokenization is helping organizations meet their business goals without exposing their sensitive data to loss.



Topics: Data Privacy, tokenization

Token Generation: New PCI SSC Tokenization Guidance

Posted by Patrick Townsend on Aug 16, 2011 12:56:00 PM

The generation of tokens is an important part of the new PCI SSC guidance on tokenization. Tokens can be generated using a number of techniques, including random number generation, encryption, sequential assignment, and indexes. Recovering the original credit card number from a token must be “computationally infeasible,” according to the new guidance.

In this area, I think the tokenization guidance could be much stronger. To the cryptographic community it is well understood how encryption and proper random number generation can produce results that can’t be feasibly reversed using computational methods. The use of indexes and sequential assignment is a lot more “iffy” in my mind. Here is an example: I have a table sorted by the credit card number that contains 100,000 rows. I read the table and assign tokens in a sequential manner.  Are these tokens as strong as tokens generated by a random number generator? Obviously not. Merchants and QSA auditors need to be very careful about a tokenization solution that uses indexes or sequence numbers for token assignment.
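A quick sketch of the distinction I am drawing here (illustrative Python only, not a certified tokenization solution): a token drawn from a cryptographically secure random number generator carries no information about the card number, while a token assigned sequentially over a sorted file does.

```python
# Illustrative sketch only -- not a certified tokenization solution.
import secrets

def random_token(length: int = 16) -> str:
    """Token from a cryptographically secure RNG; the token reveals
    nothing about the original card number."""
    return "".join(secrets.choice("0123456789") for _ in range(length))

def sequential_token(row_number: int, length: int = 16) -> str:
    """Token assigned in order over a file sorted by card number.
    An attacker who learns a few (card, token) pairs can infer roughly
    where other card numbers fall in the sequence."""
    return str(row_number).zfill(length)
```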

The use of encryption to generate tokens should also give merchants pause.  First, using encryption may produce tokens that are indistinguishable from normal credit card numbers. The guidance discusses the distinguishability of tokens, and if you use encryption to generate tokens you will probably have additional work in this area. Second, it is not yet clear whether tokens generated in this manner will be subject to additional encryption requirements and PCI SSC guidance. The work released today holds open the likelihood that there will be additional guidance for these types of tokens; in other words, the jury is still out on tokens generated by encryption. Third, tokens generated by an encryption method almost certainly have to meet all of the PCI DSS requirements for encrypted PAN. It is hard to imagine any other outcome. Merchants will want to be extra careful when deploying a solution that uses encryption to generate tokens.

Finally, we can expect to see an increase in the number of cloud-based tokenization solutions coming to market. The new guidance only touches on this in a tangential way when it discusses the need to carefully review all aspects of a tokenization solution. Since most clouds are built on virtualized environments, I suggest that you also read the virtualization guidance from the PCI SSC. It is very hard to see how any of the most popular cloud environments can meet the recommendations of that guidance.

Big kudos to Troy Leach and the members of the tokenization SIG and Working Group who’ve been laboring on this document! The council and advisory committees also deserve praise for nurturing this process. The stakeholders are a diverse community with sometimes conflicting interests, and the result speaks well of the management team. I know that we’ll be seeing more work from this group in the future.

Click here for more information on Alliance Token Manager, our tokenization solution, and learn how we can help your organization with certified, comprehensive tokenization technology.

-Patrick


Topics: tokenization, PCI, PCI SSC

PCI SSC Issues New Tokenization Guidance

Posted by Patrick Townsend on Aug 12, 2011 12:39:00 PM

Today the Payment Card Industry Security Standards Council issued long-awaited guidance on tokenization as a method of protecting credit card information (“PAN”, or Primary Account Number, in PCI language). This guidance took a number of months to produce and will be very helpful to merchants who want to deploy tokenization solutions to reduce the scope of their PCI compliance efforts, to those who have already deployed solutions and need to assess whether they meet minimum requirements, and to QSA auditors who have been looking for guidance in this area.

The link to the guidance is here.

Here are some first thoughts on reading the guidance.

As I’ve been saying here before, a tokenization solution is always in scope for PCI DSS compliance. This means the merchant is responsible for ensuring that the tokenization solution itself meets all of the PCI DSS requirements. Whether you create your own tokenization application or acquire one from a vendor, it is in scope for PCI DSS compliance and you are responsible for it. The guidance makes this point very clearly. In fact, it observes that tokenization solutions are likely to be targets of attack because they become a centralized repository of credit card information. Merchants should be very careful that their tokenization solutions meet all PCI DSS requirements and are sufficiently hardened. In my opinion, there are very few merchants who would be up to the task of creating such a solution in-house.


Following on the previous point, a tokenization solution must have encryption and key management capabilities that meet PCI DSS requirements. This means special care in the deployment and implementation of key management. You should be very sure that your key management solution implements proper Dual Control, Separation of Duties, Split Knowledge, and separation of keys from protected data. When acquiring a vendor solution for tokenization, look for FIPS 140-2 certification of the key management system – that should be a minimum requirement and an indicator of the proper implementation of encryption and controls.

Download a recent webcast I recorded that discusses the importance of tokenization and PCI compliance.


Today’s announcement by the PCI Security Standards Council on tokenization is certain to raise awareness about the importance of selecting the right tokenization solution for your organization and ensuring it meets the PCI DSS requirements.  I will post another article on Tuesday with more thoughts on this new guidance.

I hope you find this helpful.


Patrick

 

Topics: tokenization, PCI, PCI SSC

Tokenization & Encryption: The Data Protection Package

Posted by Patrick Townsend on Jul 12, 2011 8:00:00 AM

As I talk to Enterprise customers I’m finding a lot of confusion about when to use encryption or tokenization, and how to think about these two data protection technologies. Once you understand how each of these technologies works, you realize that there are no easy answers about which is best for you, or when one is better than the other. I want to talk about some general guidelines I’ve developed to help with this conundrum.

Encryption is a well-known technology with clear standards that has been in use for data protection for a long time. Most of the compliance regulations (PCI, HIPAA/HITECH, state privacy regulations, etc.) make clear reference to encryption and widely accepted standards. So it is a no-brainer to use encryption: when done correctly, it is going to meet compliance regulations and your security goals.

But encryption has a nasty side effect. When you encrypt fields in your database that are indexes or keys, you disrupt the indexing capability of the field and introduce unacceptable performance burdens on your system. Encrypting an index or key field often means re-engineering the application, which is costly and time consuming.
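A small illustration of the problem, using the third-party `cryptography` package (a sketch, not a vendor implementation): with the random nonce that strong encryption requires, equal plaintexts produce different ciphertexts, so a database index built over the encrypted column can no longer group or find them.

```python
# Sketch: why an equality index over an encrypted column stops working.
# Requires the third-party `cryptography` package.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)
aes = AESGCM(key)
pan = b"4111111111111111"

ct1 = aes.encrypt(os.urandom(12), pan, None)   # fresh random nonce each time
ct2 = aes.encrypt(os.urandom(12), pan, None)

assert ct1 != ct2   # the same card number no longer compares equal,
                    # so indexes, joins, and lookups on this column break
```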

Enter the new kid on the block – Tokenization.

When you tokenize data you replace the sensitive data with a surrogate, or token, value. The token itself is not sensitive data, but it maintains the characteristics of the original sensitive data. It walks like a duck, it quacks like a duck, but from a compliance point of view, it is NOT a duck. Tokenizing data lets you maintain those precious index and key relationships in your databases, and minimizes the number of changes you have to do to your applications.

So, why not use tokenization for everything? Is this the magic bullet we’ve been searching for?

Hold on there, Cowboy. There are some things you should think about.

Tokenization solutions typically work by creating a separate database to store the token and its relationship to the original sensitive data. This means that every time you need to register a new token, retrieve the attributes of the token, or recover the sensitive data, you have to make a request to the tokenization solution to do this work for you. Got 10 million records in your database? This is going to have a major impact on performance. Applications that need high performance may not be the best fit for a tokenization approach – you might really want to use encryption in that environment.
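To illustrate the round-trip cost, here is a sketch against a hypothetical tokenization service client; the class and method names are made up for illustration and are not a real vendor API.

```python
# Sketch of the performance point above (hypothetical client API).
def detokenize_report(rows, client):
    # One network round trip per record: fine for thousands of rows,
    # painful for a 10-million-row production table.
    return [client.detokenize(row["card_token"]) for row in rows]

def detokenize_report_batched(rows, client, batch_size=1000):
    # If the service supports batch requests, round trips drop by orders
    # of magnitude; otherwise consider encryption for this workload.
    results = []
    for i in range(0, len(rows), batch_size):
        tokens = [row["card_token"] for row in rows[i:i + batch_size]]
        results.extend(client.detokenize_batch(tokens))
    return results
```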

Then there is the question of compliance. Tokenization is a new technology. At this point there are no standards for tokenization solutions, and no reference to tokenization in the published regulations. So, are you really compliant if you tokenize sensitive data? I think so, but you should be aware that this is an unsettled question.

When you tokenize data, you are creating a separate repository of information about the original sensitive data. In most cases you will probably be using a solution from a vendor. Since the tokenization solution contains sensitive data, it will itself be in scope for compliance. Has the vendor used encryption, key management, and secure communications that meet compliance regulations? How do you know? If you are going to deploy a tokenization solution you will want to see NIST certification of the solution’s encryption and key management so that you are not just relying on the claims of the vendor.

Most Enterprise customers will probably find uses for both encryption and tokenization. Encryption is great for high-performance production applications. Tokenization is great for maintaining database relationships, and for reducing risk in development, test, QA, and business intelligence databases. Both can help you protect your company’s sensitive data!

For more information on tokenization, view our recorded webcast titled "Tokenization and Compliance - Five Ways to Reduce Costs and Increase Security."



Topics: Compliance, Encryption, tokenization

Five Ways to Protect Sensitive Data and Keep Your Database Compliant

Posted by Chris Sylvester on Jun 30, 2011 1:30:00 PM

Companies of all sizes feel increasing pressure to protect sensitive customer information to meet PCI DSS standards.  Here are five ways to help ensure your database meets PCI requirements:

1) Use certified encryption solutions to protect cardholder data

A standards-based encryption solution safeguards information stored in databases. Encryption methods approved by the National Institute of Standards and Technology (NIST) provide assurance that your data is secured to the highest standards.

2) Encrypt cardholder data that is sent across open, public networks
Transmit sensitive files over the internet using trusted encryption technologies (AES, SSH, SSL, and PGP).

3) Store encryption keys away from your encrypted data on a certified encryption key management appliance
The most important part of a data encryption strategy is the protection of the encryption keys you use. Encryption keys safeguard your encrypted data and represent the keys to the kingdom. If someone has access to your keys, they have access to your encrypted data.

4) Enforce dual controls and separation of duties for encrypted data and encryption keys
Make sure people who have access to your encrypted data are restricted from accessing the encryption keys, and vice versa. If someone can access both your encrypted data and the keys, your data is compromised.  You wouldn’t lock your door and leave the key under the mat; take the same precaution with your sensitive data.

5) Use tokenization to take servers out of the scope of compliance
Tokenization replaces sensitive data with a token. The token maintains the original data characteristics but holds no value, reducing the risk associated with sensitive data loss. When you store tokens on a separate token server, you no longer need to store the original data, even in encrypted form, on your application server, which may take that server out of scope for compliance.

Download the whitepaper Meet the Challenges of PCI Compliance and learn more about protecting sensitive data to meet PCI compliance requirements.


 

Topics: Encryption, PCI DSS, Encryption Key Management, tokenization

Tokenization: A Cost-Effective Path to Data Protection

Posted by Luke Probasco on May 19, 2011 10:20:00 AM

As companies work to meet regulatory requirements to protect Personally Identifiable Information (PII), one option to minimize the risk of loss is to replace sensitive data with a non-sensitive replacement value, or “token.”

Tokenization is the process of replacing sensitive information, such as a credit card or social security number, with a non-sensitive replacement value. The original value may be stored locally in a protected data warehouse, stored at a remote service provider, or not stored at all.  The goal of tokenization is to reduce or eliminate the risk of loss of sensitive data, and to avoid the expensive process of notification, loss reimbursement, and legal action.

There are three primary approaches to tokenization:
    •  Tokens are recoverable and stored by external service providers
    •  Tokens are recoverable and stored locally
    •  Tokens are not recoverable

The first method of tokenization uses external storage of recoverable tokens and is implemented by a small number of credit card authorization networks.

The second approach to tokenization involves the creation and storage of the token on local IT servers. The token is protected by encryption and can be recovered by decryption when it is needed.

The third type of tokenization involves the creation of a token on local IT servers, but does not allow for the recovery of the original value.
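As an illustration of the third approach, one common way to build non-recoverable tokens is to derive them from the original value with a keyed HMAC. The sketch below assumes that approach and a secret key supplied by a key manager; the same input always yields the same token, but no stored mapping exists to recover the original value.

```python
# Sketch of a non-recoverable token (assumes an HMAC-based approach).
import hmac, hashlib

SECRET_KEY = b"replace-with-a-managed-key"   # would come from a key manager

def irreversible_token(value: str, digits: int = 16) -> str:
    digest = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()
    # Re-express the digest as digits so the token fits a card-number field.
    return str(int(digest, 16))[:digits].zfill(digits)

# The same input always maps to the same token (queries and joins still work),
# but there is no vault entry that maps the token back to the card number.
assert irreversible_token("4111111111111111") == irreversible_token("4111111111111111")
```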

If you do not need to store sensitive data in your database systems, tokenization can greatly reduce your risk of data loss. The original sensitive data can still be used to query a database or locate information in a business application. But by not storing the sensitive data, you will not be at risk of losing it.

It is important to note that if you use recoverable tokens you will still have the risk of data loss and will not be protected from any liability for a loss. You will also still be subject to all of the regulations  for protecting sensitive information.

Tokenization can be a powerful way to minimize risk in your development, QA, and UAT environments. When moving data to these environments you should always eliminate sensitive data to prevent its loss. Tokenization is an excellent way to do this.

Lastly, if you are a payment systems vendor you may wish to provide tokenization as a value-added service to your merchant customers. Not only will you be helping them minimize their exposure to data loss, you can also market this service as a competitive advantage for your business.

If you would like to learn more about tokenization, we recently presented a webinar titled "Tokenization & Compliance: 5 Ways to Reduce Costs and Increase Security."


Topics: Encryption, Data Privacy, tokenization

Encryption vs. Tokenization: Which is Best for Your Business?

Posted by Luke Probasco on May 10, 2011 7:42:00 AM

Encryption and tokenization are the two leading technologies used to protect sensitive data from loss and the subsequent breach notification and legal liability. Organizations trying to meet compliance regulations struggle to understand when to use strong encryption and when to use tokenization to protect information. Many organizations will find both technologies helpful in different places in their IT infrastructure.

Encryption protects data by obscuring it with the use of an approved encryption algorithm such as AES and a secret key. Once encrypted, the original value can only be recovered if you have the secret key. The use of strong encryption keys makes it impossible, from a practical point of view, to guess the key and recover the data. Almost all compliance regulations provide a safe harbor from breach notification if sensitive data is encrypted.
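As a concrete illustration (a minimal sketch using the third-party `cryptography` package, not a specific vendor product), an AES-GCM round trip shows that recovery of the data depends entirely on possession of the secret key.

```python
# Minimal AES-GCM round trip; sketch only, requires the `cryptography` package.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # the secret key to protect and manage
aes = AESGCM(key)
nonce = os.urandom(12)

ciphertext = aes.encrypt(nonce, b"4111-1111-1111-1111", None)
plaintext = aes.decrypt(nonce, ciphertext, None)   # only possible with `key`
assert plaintext == b"4111-1111-1111-1111"
```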

Encryption - Protecting Sensitive Data Where It Lives

Encryption is a mature technology with a recognized body of standards and independent certification of vendor technologies, and it undergoes continual scrutiny from the professional cryptographic community. Organizations that deploy professional encryption solutions that have been independently certified (NIST certification) enjoy a high level of confidence in the protection of their data assets.

Tokenization - Protecting Sensitive Data with Substitution

Tokenization works by substituting a surrogate value for the original sensitive data. This surrogate value is called a “token”. The token value does not contain sensitive information; it replaces the sensitive value while maintaining the characteristics of the original.  There is one and only one token value for any given original value. For example, the credit card number 4111-1111-1111-1111 might be assigned the token value 1823-5877-9043-1002. Once this token is assigned, it will always be used wherever the original value would have been used.

Combining Encryption and Tokenization

For most organizations there will be appropriate uses for both encryption and tokenization. Encryption will be the right solution for one set of applications and databases, and tokenization will be the right solution for others.  The appropriate technology will be decided by each organization’s technical, compliance, and end-user staff working together.

In order to ease the development and compliance burden, organizations may wish to source encryption and tokenization solutions from the same vendor. There are many overlapping technologies in both encryption and tokenization, and you will probably want a common approach to both.

If you would like to learn more about tokenization, we recently presented a webinar titled "Tokenization & Compliance: 5 Ways to Reduce Costs and Increase Security." 


Topics: Compliance, Encryption, tokenization

Encryption, Key Management and Tokenization - The Trifecta

Posted by Patrick Townsend on Feb 11, 2011 11:47:00 AM

Compliance regulations are moving inexorably towards requiring the protection of sensitive data. The private information of customers, employees, patients, vendors, and all of the people our Enterprises come into contact with must be protected from loss and misuse. At the same time that regulations are getting more teeth, there is more consensus about the technologies to protect data in our applications and databases. Encryption and tokenization are now the accepted methods of protecting data, and encryption key management is central to both technologies.

How fast are regulations changing? Really fast. The Payment Card Industry Security Standards Council will update the PCI Data Security Standard (PCI DSS) this year, and will be on a three year cycle of updates. Merchants accepting credit cards will have about 18 months to implement the changes. State privacy laws are undergoing frequent changes, most of which make the rules more stringent. Minnesota, Nevada, and Washington State have made recent changes. The HITECH Act of 2009 and related guidance further tightens the rules around protecting patient data, and further guidance is expected this year. Last, but not least, the federal government is moving new legislation through Congress to enact a national privacy law.

These changes are coming fast, and they have one thing in common: data protection requirements are getting stronger, not weaker. Companies and organizations should be paying attention to their data protection strategies now in order to avoid expensive rip-and-tear operations in the future.

One other tendency of the evolving regulations is this: a clear reference to standards for data protection. All of the mentioned regulations now make reference to widely accepted standards, usually those of the National Institute of Standards and Technology (NIST), which publishes standards and testing protocols for encryption and key management. Over the last two years PCI (and related guidance from Visa), the HITECH Act, state privacy laws, and other regulations have specifically referenced NIST for data encryption and key management standards.

Companies and organizations acquiring data protection technologies should look carefully at how solutions match up to the standards. And a word of warning here: there is still a lot of snake oil in the data protection industry. Be sure that your data protection vendor can prove that their solutions actually meet the NIST standards. This is not hard to verify independently – NIST publishes online lists of vendors who have certified their solutions to the standard.

Encryption is a well-defined technology that protects data through the use of an encryption algorithm and a secret key. When combined with proper key management, encryption provides a widely accepted method of protecting sensitive data. There is a long history of work by professional cryptographers and NIST on defining how good encryption and key management should work, and you can easily determine which vendors meet the standards through the certification process.

Tokenization is a newer technology that lacks the history and standards of encryption, but it incorporates encryption technologies. Tokenization works by substituting a surrogate value (or “token”) for the original data. By itself the token does not tell you anything about the original value, which might be a credit card number, patient ID, and so forth. But tokenization requires that you use good encryption practices to protect the sensitive data in the token database. This means you have to use a tokenization solution that meets the same stringent standards for encryption and key management. When acquiring a tokenization solution, you will want to apply the same diligence about encryption and key management that you would use for a pure encryption solution – that is, the solution should be built to standards and the vendor should be able to prove it through the NIST certification process.

Remember, a tokenization solution will be IN SCOPE for a PCI audit!

Tokenization standards are still evolving. Bob Russo of the PCI Security Standards Council indicated that the council will be taking up work on this in 2010. Visa just released a best practices guide for tokenization (you can get it here), and you can probably expect the eventual standards to incorporate much of this guidance. Additionally, the X9 organization is also working on standards for tokenization.

In regard to tokenization standards, stay tuned! Much more is coming our way.

Encryption, tokenization, and key management – this is the trifecta for protecting data at rest. I’ll have more comments in the future about tokenization as we analyze the best practice guidance from Visa and help you connect the dots with our encryption, tokenization, and key management solutions.

Patrick

Topics: Encryption, NIST, Key Management, PCI DSS, tokenization