
Townsend Security Data Privacy Blog

The Business Case for Tokenization

Posted by Luke Probasco on Feb 28, 2012 11:44:00 AM


Tokenization is a technology that reduces the chance of losing sensitive data – credit card numbers, Social Security numbers, banking information, and other types of Personally Identifiable Information (PII). It does this by replacing a real value with a made-up value, or “token,” that has the same characteristics. The token has no relationship to the original person and thus has no value to data thieves if it is lost. As long as a token cannot be used to recover the original value, it works well to protect sensitive data.
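To make the idea concrete, here is a minimal Python sketch of generating such a token (this is an illustration, not Townsend Security's implementation): the token is drawn at random, independently of the input, so it keeps the original's length and character class but carries no information that could reverse it.

```python
import secrets

def tokenize(value: str) -> str:
    """Replace a digit string with a random token of the same length
    and character class. The token is generated independently of the
    input, so it cannot be reversed to recover the original value."""
    return "".join(secrets.choice("0123456789") for _ in value)

card = "4111111111111111"   # a well-known test card number
token = tokenize(card)

# The token looks like a card number but has no relation to the real one.
assert len(token) == len(card)
assert token.isdigit()
```

Because the token is random rather than derived from the input, there is nothing for a thief to attack; a production system would keep the token-to-value mapping in a secured vault.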

Tokenization in Development and QA Environments

Tokenization is an excellent way to provide developers and testers with data that meets their requirements for format and consistency without exposing real information to loss. Real values are replaced with tokens before data is moved to a development system, and the relationships between databases are maintained. Unlike encrypted values, tokens preserve the data types and lengths that database applications require. For example, a real credit card number might be replaced with the token 7132498712980140. The token has the same length and characteristics as the original value, and the same token appears in every table. By tokenizing development and QA data you remove the risk of loss from these systems, and you remove suspicion from your development and QA teams in the event of a data loss.
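The consistency property described above – the same real value always mapping to the same token, so joins between tables still work – is typically achieved with a token vault. A minimal in-memory sketch in Python (hypothetical; real products store the vault securely on a hardened server):

```python
import secrets

class TokenVault:
    """Minimal in-memory token vault: each distinct value receives one
    random, format-preserving token, which is reused on every later
    request so relationships between tables are preserved."""

    def __init__(self):
        self._tokens = {}

    def tokenize(self, value: str) -> str:
        if value not in self._tokens:
            # Generate a random digit string of the same length.
            self._tokens[value] = "".join(
                secrets.choice("0123456789") for _ in value)
        return self._tokens[value]

vault = TokenVault()
orders = ["4111111111111111", "5500005555555559", "4111111111111111"]
tokens = [vault.tokenize(card) for card in orders]

assert tokens[0] == tokens[2]   # same card -> same token in every table
assert len(tokens[1]) == 16     # format preserved for the application
```

Because the vault, not the development database, holds the only link between token and real value, the development and QA copies carry no sensitive data at all.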

Tokenization for Historical Data

In many companies, sensitive data is stored in production databases where it is not actually needed. For example, we tend to keep historical information so that we can analyze trends and understand our business better. Tokenizing this sensitive data provides a real reduction in the risk of loss, and in many cases it can take an entire server or database application out of scope for compliance regulations. At one large US company, tokenization removed over 80 percent of the servers and business applications from compliance review, reducing both the risk of data loss and the cost of compliance audits.

Download our white paper “The Business Case for Tokenization: Reducing the Risk of Data Loss” to see how tokenization is helping organizations meet their business goals without exposing their sensitive data to loss.


Topics: Data Privacy, tokenization
