Token generation is an important part of the new PCI SSC guidance on tokenization. Tokens can be generated using a number of techniques, including random number generation, encryption, sequential assignment, and indexes. According to the new guidance, recovering the original credit card number from a token must be "computationally infeasible."
In this area, I think the tokenization guidance could be much stronger. It is well understood in the cryptographic community how encryption and proper random number generation can produce results that cannot feasibly be reversed by computational methods. The use of indexes and sequential assignment is a lot more "iffy" in my mind. Here is an example: suppose I have a table of 100,000 rows sorted by credit card number, and I read the table and assign tokens sequentially. Because the tokens preserve the sort order of the card numbers, each token leaks information about the relative value of the PAN behind it. Are these tokens as strong as tokens generated by a random number generator? Obviously not. Merchants and QSA auditors need to be very careful about a tokenization solution that uses indexes or sequence numbers for token assignment.
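To make the contrast concrete, here is a minimal sketch (my own illustration, not anything from the guidance) of the two assignment strategies. The function names and the 16-digit token format are assumptions for the example; a real product would also enforce uniqueness against a persistent vault.

```python
import secrets

def sequential_tokens(pans):
    """Assign tokens in order over a PAN-sorted table.

    The weakness: a larger token always corresponds to a larger card
    number, so every token leaks the relative ordering of its PAN.
    """
    return {pan: index for index, pan in enumerate(sorted(pans))}

def random_tokens(pans):
    """Assign each PAN an independent random 16-digit token.

    A token value carries no information about the PAN it maps to;
    reversal requires access to the mapping table itself.
    """
    table = {}
    used = set()
    for pan in pans:
        token = "".join(secrets.choice("0123456789") for _ in range(16))
        while token in used:  # regenerate on the (rare) collision
            token = "".join(secrets.choice("0123456789") for _ in range(16))
        used.add(token)
        table[pan] = token
    return table
```

Note the use of the `secrets` module rather than `random`: token generation calls for a cryptographically strong source, not a general-purpose PRNG.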
The use of encryption to generate tokens should also give merchants pause for extra thought. First, encryption may produce tokens that are indistinguishable from real credit card numbers. The guidance discusses the distinguishability of tokens, and if you use encryption to generate them you will probably have additional work in this area. Second, it is not yet clear whether tokens generated this way will be subject to additional encryption requirements and PCI SSC guidance. The work released today holds open the likelihood of further guidance for these types of tokens; in other words, the jury is still out on tokens generated by encryption. Finally, tokens generated by an encryption method almost certainly have to meet all of the PCI DSS requirements for encrypted PAN. It is hard to imagine any other outcome. Merchants will want to be extra careful when deploying a solution that uses encryption to generate tokens.
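The distinguishability concern can be made concrete with the Luhn (mod-10) check that every valid PAN satisfies. One approach some vendors take (an assumption on my part, not something the guidance prescribes) is to emit only tokens that deliberately fail the Luhn check, so a token can never be mistaken for a live card number:

```python
def luhn_valid(number: str) -> bool:
    """Standard Luhn (mod-10) check used to validate card numbers.

    Starting from the rightmost digit, double every second digit,
    subtract 9 from any doubled value above 9, and sum everything;
    a valid number sums to a multiple of 10.
    """
    checksum = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0
```

A format-preserving encryption of a PAN yields a 16-digit string that may or may not pass this check, which is exactly why such tokens are hard to tell apart from cardholder data without extra design work.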
Finally, we can expect to see an increase in the number of cloud-based tokenization solutions coming to market. The new guidance touches on this only tangentially, when it discusses the need to carefully review all aspects of a tokenization solution. Since most clouds are built on virtualized environments, I suggest that you also read the PCI SSC virtualization guidance. It is very hard to see how any of the most popular cloud environments can meet the recommendations of that document.
Big kudos to Troy Leach and the members of the tokenization SIG and Working Group, who have been laboring over this document! The Council and advisory committees also deserve praise for nurturing this process. The stakeholders are a diverse community with sometimes conflicting interests, and the result speaks well of the management team. I know we'll be seeing more work from this group in the future.