Tokenization is a technology that reduces the chance of losing sensitive data – credit card numbers, Social Security numbers, banking information, and other Personally Identifiable Information (PII). It works by replacing a real value with a made-up value that has the same characteristics. The made-up value, or “token”, has no relationship to the original person and thus has no value if it falls into the hands of data thieves. As long as a token cannot be used to recover the original value, it protects sensitive data well.
Tokenization in Development and QA Environments
Tokenization is an excellent way to provide developers and testers with data that meets their requirements for format and consistency, without exposing real information to loss. Real values are replaced with tokens before data is moved to a development system, and the relationships between databases are maintained. Unlike encrypted values, tokens preserve the data types and lengths that database applications require. For example, a real credit card number might be replaced with the token 7132498712980140: the token has the same length and characteristics as the original value, and it is the same in every table. By tokenizing development and QA data you remove the risk of loss from these systems, and you remove suspicion from your development and QA teams in the event of a data loss.
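The two properties described above – same length and character class as the original, and the same token for the same value in every table – can be illustrated with a minimal sketch. This is not any particular vendor's implementation; it assumes a simple in-memory vault (a production system would persist the vault securely, guard against collisions, and control access to it):

```python
import secrets


class Tokenizer:
    """Minimal sketch of consistent, format-preserving tokenization.

    Each distinct real value maps to one randomly generated token of
    the same length and character class (digits here). The mapping is
    stored in a vault so the same value yields the same token in every
    table, but the token itself carries no information about the
    original value.
    """

    def __init__(self) -> None:
        self._vault: dict[str, str] = {}  # real value -> token

    def tokenize(self, value: str) -> str:
        if value not in self._vault:
            # Random digit string of the same length as the original.
            # (A real system would also check for collisions.)
            token = "".join(
                secrets.choice("0123456789") for _ in range(len(value))
            )
            self._vault[value] = token
        return self._vault[value]


tokenizer = Tokenizer()
card = "4111111111111111"
token_a = tokenizer.tokenize(card)  # e.g. used in the orders table
token_b = tokenizer.tokenize(card)  # e.g. used in the billing table
```

Because the token is random rather than derived from the card number, it cannot be reversed; consistency comes entirely from the vault lookup, so referential integrity across tables is preserved.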
Tokenization for Historical Data
In many companies, sensitive data is stored in production databases where it is not actually needed. For example, we tend to keep historical information so that we can analyze trends and understand our business better. Tokenizing sensitive data in this case provides a real reduction in the risk of loss, and in many cases it can take an entire server or database application out of scope for compliance regulations. At one large US company, tokenization removed over 80 percent of the servers and business applications from compliance review. This reduced the risk of data loss and greatly reduced the cost of compliance audits.
Download our white paper “The Business Case for Tokenization: Reducing the Risk of Data Loss” to see how tokenization is helping organizations meet their business goals without exposing their sensitive data to loss.