
Townsend Security Data Privacy Blog

MongoDB Encryption Key Rotation

Posted by Patrick Townsend on Jul 9, 2018 11:11:00 AM

Every now and then you get something wrong and have to eat crow. Well, it’s my turn. Caw!

At the MongoDB World 2018 conference I gave the security session on encryption key management for MongoDB. I love giving this session because MongoDB makes it easy to get encryption key management right: it exposes a Key Management Interoperability Protocol (KMIP) interface that lets you plug in a key manager. I talked about the MongoDB interface and the principles of good encryption and key management, and then demonstrated how easy it is to deploy our Alliance Key Manager with MongoDB. A complete implementation takes just a few minutes.

There were really good questions at the end of the session. One of those questions was:

“How do you rotate the encryption keys for MongoDB?” 

Key rotation is, of course, required under the PCI Data Security Standard (PCI DSS) and is a good security practice. The National Institute of Standards and Technology (NIST) addresses this in their Key Management Best Practices document Special Publication 800-57. See the section on crypto-periods for data encryption keys.

This is where I ran off the rails. It was my impression that you had to create a new MongoDB database, encrypt it with a new key, and then copy the data into the new database. And that was not right! That is the procedure for some databases, but not MongoDB.

Luckily Davi Ottenheimer, MongoDB’s head of security, and a MongoDB engineer were right there to set me straight.

Fortunately you do NOT have to create a new MongoDB database in order to roll the data encryption key. It is actually much simpler. Here are the steps you use to roll the data encryption key (extracted from the MongoDB documentation for KMIP key management):

1) Rotate the master key for the secondary members of the replica set one at a time.

a. Restart the secondary, including the --kmipRotateMasterKey option. Include any other options specific to your configuration, such as --bind_ip. If the member already includes the --kmipKeyIdentifier option, either update it with the new key to use or omit it to request a new key from the KMIP server:

mongod --enableEncryption --kmipRotateMasterKey \
  --kmipServerName <KMIP Server HostName> \
  --kmipServerCAFile ca.pem \
  --kmipClientCertificateFile client.pem

If using a configuration file, include the security.kmip.rotateMasterKey setting.

b. Upon successful completion of the master key rotation and re-encryption of the database keystore, the mongod will exit.

c. Restart the secondary without the --kmipRotateMasterKey parameter. Include any other options specific to your configuration, such as --bind_ip.

mongod --enableEncryption \
  --kmipServerName <KMIP Server HostName> \
  --kmipServerCAFile ca.pem \
  --kmipClientCertificateFile client.pem

If using a configuration file, remove the security.kmip.rotateMasterKey setting.

2) Step down the replica set primary.

Connect a mongo shell to the primary and use rs.stepDown() to step down the primary and force an election of a new primary:

 rs.stepDown()

When rs.status() shows that the primary has stepped down and another member has assumed PRIMARY state, rotate the master key for the stepped down member:

a. Restart the stepped-down member, including the --kmipRotateMasterKey option. Include any other options specific to your configuration, such as --bind_ip. If the member already includes the --kmipKeyIdentifier option, either update it with the new key to use or omit it to request a new key from the KMIP server:

mongod --enableEncryption --kmipRotateMasterKey \
  --kmipServerName <KMIP Server HostName> \
  --kmipServerCAFile ca.pem \
  --kmipClientCertificateFile client.pem

If using a configuration file, include the security.kmip.rotateMasterKey setting.

b. Upon successful completion of the master key rotation and re-encryption of the database keystore, the mongod will exit.

c. Restart the stepped-down member without the --kmipRotateMasterKey option. Include any other options specific to your configuration, such as --bind_ip.

mongod --enableEncryption \
  --kmipServerName <KMIP Server HostName> \
  --kmipServerCAFile ca.pem \
  --kmipClientCertificateFile client.pem

If using a configuration file, remove the security.kmip.rotateMasterKey setting.
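The command-line flags above have configuration-file equivalents. As a sketch (the hostname and certificate paths are the same placeholders used in the commands above), the relevant YAML settings look like this:

```yaml
# mongod.conf -- configuration-file equivalents of the flags above
security:
  enableEncryption: true
  kmip:
    serverName: <KMIP Server HostName>
    serverCAFile: ca.pem
    clientCertificateFile: client.pem
    # Present only for the rotation restart; remove it (as in steps
    # 1c and 2c) before restarting the member normally:
    rotateMasterKey: true
```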

This is pretty easy. You can find the documentation online at the MongoDB documentation site.

If you attended my class in New York, or if you are implementing encryption and key management for MongoDB, I hope you find this helpful. As soon as the recording of the class is available I’ll send it to you in a new blog.

Thanks to all of the MongoDB team members who put on the MongoDB World in New York! It was an exciting two days and I always learn a lot at the conferences.

Patrick


Topics: MongoDB Encryption Key Management, MongoDB

Customer Information: IBM i Security Products Acquired by Syncsort

Posted by Patrick Townsend on May 21, 2018 1:11:00 PM

We are pleased to announce that Syncsort has acquired Townsend Security's IBM i security offerings. Solutions in the sale include Townsend Security's encryption, tokenization, authentication, FTP, and SIEM integration products. Syncsort is the global leader in Big Iron to Big Data software. The acquisition helps Syncsort deliver on its ongoing commitment to strengthening the security of the IBM i platform and to the businesses that rely on its performance and resilience to drive their business forward.

Townsend Security will continue in business focusing on Alliance Key Manager, our FIPS 140-2 compliant encryption key management solution that is in use by over 3,000 customers worldwide. Alliance Key Manager provides encryption key management and encryption services to a wide set of relational database solutions, big data solutions, and blockchain platforms. We are pleased that we will have an on-going partnership with Syncsort around providing the Townsend Security Alliance Key Manager solution to Syncsort customers.

During the transition, Townsend Security will work closely with Syncsort to make sure that your IBM i product needs are met. As part of the acquisition, we will be sharing your account information with Syncsort and anticipate a full handover within 6 months.

To learn more about Syncsort’s IBM i products and content offers, you can sign up here to be added to their mailing list.

We thank you for your business and trust you will continue to experience the same high level of customer satisfaction and support with Syncsort.

Patrick Townsend
Founder/CEO
360.359.4400

Topics: Customer Information

SQL Server Always Encrypted vs Transparent Data Encryption (TDE)

Posted by Patrick Townsend on Apr 30, 2018 10:42:39 AM

Microsoft SQL Server customers ask us whether they should use Always Encrypted or Transparent Data Encryption (TDE) to protect sensitive data. It is a relevant question, especially given the concern for the GDPR, and it is relevant to many other compliance regulations and data protection needs. Let’s explore these technologies in more detail and I think the answer will emerge.

Always Encrypted

Always Encrypted is a relatively new, client-side implementation of encryption for SQL Server. That is, data is encrypted on the source system before you insert it into a SQL Server database. Implemented as a Windows driver, Always Encrypted intercepts your SQL statement before it leaves your client-side system (PC, web server, etc.), determines whether any fields in the SQL statement need to be encrypted, establishes or retrieves an encryption key, encrypts those fields, and sends the modified SQL statement to SQL Server for processing.

One advantage of this approach is that the data is encrypted as it travels over the internal or external network, and as it resides in the database. A retrieval of the encrypted data reverses this process ensuring that the data is protected in transit.

One significant limitation of Always Encrypted is that it can only be applied to a small subset of SQL operations. Many SQL operations are complex and cannot be processed by Always Encrypted.

Transparent Data Encryption and Cell Level Encryption

SQL Server Transparent Data Encryption (TDE) and Cell Level Encryption (CLE) are server-side facilities that encrypt either the entire SQL Server database at rest (TDE) or selected columns (CLE). The client-side application is completely unaware of the implementation of TDE or CLE, and no software is installed on the client-side system. All of the encryption tasks are performed by the SQL Server database itself. It is easy to implement and performs very well for most SQL Server customers.

TDE and CLE are a part of the Microsoft SQL Server Extensible Key Management (EKM) strategy which has been a part of SQL Server since the 2008 edition. It is a mature technology and many organizations deploy TDE or CLE to protect their sensitive information.

Encryption Key Management

An encryption strategy is only as good as the encryption key management strategy. Creating, protecting, and using encryption keys is the hardest part of encryption, and for the sake of your overall security, it is important to get key management right.

With Transparent Data Encryption the key management strategy is well defined through the Microsoft Extensible Key Management (EKM) Provider interface. Key management systems such as our Alliance Key Manager for SQL Server provide software written to the EKM interface specification and SQL Server customers get full encryption key management integrated through the EKM Provider interface.

Always Encrypted does not provide a formal interface for key management. Instead, third party vendors must provide Always Encrypted drivers that implement an interface to a key management system. This means that the interface to the key management system is proprietary. Additionally, since the Always Encrypted implementation is a client-side deployment, each client-side application will have to have access to the key manager in order to properly protect and use encryption keys. This can be a challenge in distributed network topologies.

The bottom line: you can achieve proper key management with either approach, but expect to deal with more complexity with Always Encrypted when you have distributed clients.

When to Use Always Encrypted

Because Always Encrypted functions by modifying the SQL operation before it interfaces with the SQL Server database, and because many complex SQL operations will not work with Always Encrypted, I would only recommend using Always Encrypted when the application architecture is very simple. For example, you might want to use Always Encrypted to send data from an internal SQL Server database to a web-hosted SQL Server database and application. The data will be protected in transit and encrypted in the database. As long as your web application makes simple SQL queries to the database this approach can work well.

When to use TDE or CLE

Transparent Data Encryption and Cell Level Encryption are implemented directly in the SQL Server database itself. This means that complex SQL operations are fully supported and you will not find limitations imposed by your encryption or application strategy in the future. The SQL Server implementation of TDE and CLE also supports a standardized interface for encryption key management. This means you will have choices in key management vendors and can readily find a solution. TDE also has good performance metrics for the large majority of SQL Server customers, and is exceptionally easy to implement.

Summary

For the reasons outlined above I believe that Transparent Data Encryption or Cell Level Encryption implemented within SQL Server will almost always be the better choice for most SQL Server customers. You will not be locked into restrictions on the use of SQL operations, key management is well defined, and the encryption performance metrics are impressive.

I hope this overview of the two SQL Server encryption approaches has been helpful. Please feel free to reach out with any questions you have.

Patrick


Topics: SQL Server, Transparent Data Encryption (TDE)

GDPR - Do I have to Use Encryption?

Posted by Patrick Townsend on Apr 24, 2018 8:44:17 AM

As the date for the formal implementation of the EU General Data Protection Regulation draws near, many of our customers are struggling with the question of whether or not they have to encrypt sensitive data. Because the fines for violating the GDPR can be quite high, this question is taking on a growing urgency. So, let’s take a look at this question in more detail by looking at the actual GDPR source documents.

The most relevant part of the GDPR related to encryption is Article 32, “Security of Processing”. The actual text of the article is very readable and you can find a link in the Resources section below. Here is an extract from Article 32 (emphasis added):

“Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate:

  1. the pseudonymisation and encryption of personal data;
  2. the ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services;
  3. the ability to restore the availability and access to personal data in a timely manner in the event of a physical or technical incident;
  4. a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring the security of the processing.”

Well, it looks like we don’t really have to encrypt the sensitive data, because we get to take into account the costs of implementation and the nature, scope, context, and purpose of processing, along with some other potentially mitigating factors. If you read no further, you might draw the conclusion that encryption is a recommendation but not a requirement. Question answered, right?

Not so fast. Let’s dig deeper. The next point in Article 32 shines a brighter light on this question:

“2. In assessing the appropriate level of security account shall be taken in particular of the risks that are presented by processing, in particular from accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to personal data transmitted, stored or otherwise processed.”

In effect the GDPR is saying that your security controls must account for the risk of accidental, unlawful, or unauthorized disclosure or loss of personal data. That is a very broad category of potential violations of the protection of an individual’s data. Have you ever lost a backup cartridge? Do you really think your systems are secure enough to prevent access by sophisticated cyber criminals?

While on first look it seems that we have some leeway related to the deployment of encryption, GDPR quickly raises the bar on this question. Given the current state of security of data processing systems, no security professional should be absolutely comfortable with the security of their systems.

If you are still thinking you can avoid encrypting sensitive data, be sure to take a read of Recital 78, “Appropriate technical and organisational measures”.

It should be clear by now that if you decide NOT to encrypt sensitive data you should definitely document all of the reasons it is not feasible or practical to do so, and all of the measures you are actually taking to protect that data. Put this in writing and get senior management sign-off on your conclusions.

But there is more.

If you are wondering how serious GDPR is about encryption, be sure to read Recital 83 “Security of processing”. Here is an extract with emphasis added:

“In order to maintain security and to prevent processing in infringement of this Regulation, the controller or processor should evaluate the risks inherent in the processing and implement measures to mitigate those risks, such as encryption. Those measures should ensure an appropriate level of security, including confidentiality, taking into account the state of the art and the costs of implementation in relation to the risks and the nature of the personal data to be protected. In assessing data security risk, consideration should be given to the risks that are presented by personal data processing, such as accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed which may in particular lead to physical, material or non-material damage.”

If you are getting the notion that the authors of the GDPR really want you to encrypt sensitive data, you would be right.

Where else does encryption come into play?

There are safe-harbors in GDPR around data breach notification IF you are encrypting sensitive data. The avoidance of notification is not absolute, but here is one relevant section of Article 34, “Communication of a personal data breach to the data subject” (emphasis added):

The communication to the data subject referred to in paragraph 1 shall not be required if any of the following conditions are met:

  1. the controller has implemented appropriate technical and organisational protection measures, and those measures were applied to the personal data affected by the personal data breach, in particular those that render the personal data unintelligible to any person who is not authorised to access it, such as encryption;

If the sensitive data of a data subject is lost and not encrypted, it will be difficult to argue that the information is inaccessible. The loss of unencrypted data will certainly require notification to the supervisory authority and the data subject.

There is one more aspect to the discussion of encryption and that relates to the management of encryption keys. Your encryption strategy is only as good as your ability to protect your encryption keys. This is reflected in Recital 85 “Notification obligation of breaches to the supervisory authority” (emphasis added):

“A personal data breach may, if not addressed in an appropriate and timely manner, result in physical, material or non-material damage to natural persons such as loss of control over their personal data or limitation of their rights, discrimination, identity theft or fraud, financial loss, unauthorised reversal of pseudonymisation, damage to reputation, loss of confidentiality of personal data protected by professional secrecy or any other significant economic or social disadvantage to the natural person concerned. Therefore, as soon as the controller becomes aware that a personal data breach has occurred, the controller should notify the personal data breach to the supervisory authority without undue delay and, where feasible, not later than 72 hours after having become aware of it, unless the controller is able to demonstrate, in accordance with the accountability principle, that the personal data breach is unlikely to result in a risk to the rights and freedoms of natural persons. Where such notification cannot be achieved within 72 hours, the reasons for the delay should accompany the notification and information may be provided in phases without undue further delay.”

If you are not properly protecting the encryption key used for encryption, it must be assumed that the encryption can be reversed. Don’t use weak encryption keys such as passwords, don’t store encryption keys in files or in application code. Instead, use a professional key management solution to protect the keys.
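To make the point about weak keys concrete, here is a minimal Python sketch contrasting a password with a properly generated 256-bit key. It illustrates key generation only; as noted above, production keys belong in a key management solution, not in application code or files.

```python
import secrets

# A human-chosen password: short, low-entropy, guessable. Never use
# something like this directly as an encryption key.
weak_key = b"P@ssw0rd!"

# A proper 256-bit data-encryption key from a cryptographically secure
# random number generator. In production, a key like this is generated
# and stored inside a key manager, never hard-coded or written to disk.
strong_key = secrets.token_bytes(32)

print(len(strong_key))  # -> 32 (32 bytes = 256 bits)
```

Every call to `secrets.token_bytes(32)` yields a fresh, unpredictable key, which is exactly the property a password lacks.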

Returning to our original question about the need for encryption of sensitive data, I hope you have arrived at Yes as the most responsible answer. The loss of unencrypted sensitive data will definitely trigger the need for data breach notification. And the improper protection of encryption keys will also trigger the need for breach notification. You are at more risk of financial penalties if you are not properly protecting that sensitive information with encryption.

The GDPR is complex and some parts are subject to interpretation. But if you control or process sensitive data you should not underestimate the serious intent of the GDPR to enforce protections for individuals. GDPR is revolutionary and disruptive - it is dangerous to ignore it.

Patrick



Resources
The General Data Protection Regulation (GDPR)
The GDPR Recitals
GDPR Article 32, “Security of Processing”
Recital 78, “Appropriate technical and organisational measures”
Recital 83, “Security of processing”
GDPR Article 34, “Communication of a personal data breach to the data subject”
Recital 85, “Notification obligation of breaches to the supervisory authority”

EU Data Privacy Protections and Encryption

Topics: EU GDPR, Encryption

Top 3 Areas of Weakness when Undergoing a PCI Audit for the First Time

Posted by Sophia Khan - CyberGuard Compliance on Apr 17, 2018 8:21:00 AM

Guest blog by Sophia Khan of CyberGuard Compliance


With all of the security breaches in the news and these incidents becoming more widespread, how can you be sure your customers’ credit card information remains secure? This is the purpose of the PCI DSS (Payment Card Industry Data Security Standard), which applies to all merchants who accept credit cards, irrespective of their annual revenue and credit card transaction volume. Even with this mandatory requirement, a vast majority of organizations still struggle to maintain PCI compliance, and the process is costing companies a great deal in consulting fees to address the root causes of PCI audit failures. By proactively assessing the areas of weakness when undergoing a PCI audit, especially for the first time, companies can avoid the frustrations of a failed audit and be well on their way to continued PCI compliance.

FIRST TIME PCI AUDIT AREAS OF WEAKNESS:


1) Encryption Key Management

Several sections of the PCI DSS address cryptography and key management with respect to the protection of cardholder data. When a company is unfamiliar with the ever-evolving encryption standards and requirements, this can pose additional challenges in maintaining PCI compliance. The PCI DSS does not specify which cryptographic standards should be used; however, most companies implement the Advanced Encryption Standard (AES) or the Triple Data Encryption Standard (TDES), which are widely accepted for the encryption of sensitive data. These are the standards approved by the National Institute of Standards and Technology (NIST).

The list of requirements includes protecting the cryptographic keys used to encrypt cardholder data by limiting access to the fewest custodians necessary, and securely storing keys in as few locations as possible. Additionally, cryptographic keys must be strong, with secure key distribution and storage, and periodic key changes are mandatory. For example, if a vendor hosts your encryption keys, then the vendor, and potentially attackers, can access them. For maximum security, it is therefore essential to host your own encryption keys.

The solution to the complex issue of encryption key management is to use encryption and tokenization tools that protect sensitive information with NIST-validated AES database encryption. There are five fundamentals of encryption key management:

  1. Key storage
  2. Key policy management
  3. Key authentication
  4. Key authorization
  5. Key transmission

By implementing a strong key management application, sensitive data can be protected and third parties prevented from accessing unencrypted content.

2) Two-Factor Authentication

Passwords are typically the first and final line of defense in protecting user accounts. They are also the easiest for hackers to crack, and on their own they can compromise the security of sensitive information. Two-factor authentication, also known as multi-factor authentication, significantly reduces the risk of system intrusion because you are required to use both a password and an additional measure, such as a security code generated on a mobile device. PCI security standards require at least two of the three authentication methods described in the requirement:

  • Something you know: This utilizes knowledge of something such as a password, PIN, or phrase.
  • Something you have: This may involve an RSA token device, smartcard, key fob, or cellular device with mobile authentication.
  • Something you are: This involves biometric measures such as a fingerprint or retina scan, facial or voice recognition, or other unique physical identification.

By utilizing two-factor authentication, companies can reduce the threat of intrusion with an added authentication mechanism beyond a simple password.
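As a minimal sketch of the “something you have” factor, here is a time-based one-time password (TOTP, RFC 6238) generator in Python using only the standard library; this is the kind of rotating code a mobile authenticator app produces. It is an illustration, not a PCI-validated implementation, and the shared secret shown is the published RFC test key.

```python
import hashlib
import hmac
import struct
import time

def hotp(key, counter, digits=6):
    # RFC 4226: HMAC-SHA1 over the big-endian counter, then dynamic truncation
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(key, for_time=None, step=30):
    # RFC 6238: HOTP applied to a counter derived from the current time
    now = time.time() if for_time is None else for_time
    return hotp(key, int(now // step))

# With the RFC 6238 test secret, time 59s falls in the second 30-second
# window (counter 1), which yields the documented value 287082.
print(totp(b"12345678901234567890", for_time=59))  # -> 287082
```

A server verifying the code recomputes `totp()` for the current time window (and usually the adjacent windows, to tolerate clock skew) and compares it to what the user submitted.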

3) System Logging

Section 10 of the PCI DSS indicates organizations must track and monitor all access to network resources and cardholder data. This is one of the most important PCI compliance requirements as it relates to network security and access. The requirement has many subsections outlining what must be fulfilled in order to maintain compliance in this section.

The following must be logged and maintained:

  • There must be a system for logging access to all system components and every individual user.
  • Audit trails for each of the following must be established:
    • Individual user access to cardholder data;
    • All actions performed by users with administrative privileges;
    • All invalid access attempts;
    • Tracking of audit log clearings;
    • Secure audit trail logs, with access restricted to those with a job-related need.

System logging capabilities and the ability to track user activities are imperative in detecting, preventing, and diminishing the potential impact of a data security breach. In the absence of system activity logs, it would be nearly impossible to detect the cause of a compromise and correct it. By maintaining knowledge of who had access to the system and what they used the data for, a company can be proactive in the event cardholder data goes missing or there is suspicion of any foul play.
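As an illustrative sketch, the audit-trail items above can be captured as structured log records. The field names here are hypothetical, not a PCI DSS-mandated schema:

```python
import datetime
import json

def audit_record(user, action, resource, success):
    # One structured audit-trail entry; in a real deployment these records
    # would be forwarded to a central, access-restricted log server so they
    # cannot be altered on the host that produced them.
    return json.dumps({
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user": user,          # the individual user, never a shared account
        "action": action,      # e.g. a query, an admin command, a log clearing
        "resource": resource,  # the system component or data set touched
        "success": success,    # failed entries record invalid access attempts
    })

print(audit_record("alice", "read", "cardholder_db", False))
```

Recording failures as well as successes is what makes the trail useful for spotting invalid access attempts after the fact.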

In recent times, many organizations have fallen into the “check the box” category, where PCI compliance is simply a mandatory prerequisite they try to get through for the sake of fulfilling a minimum requirement. By practicing good cybersecurity hygiene on a consistent basis, and being cognizant of the potential areas of weakness, your company can ensure cyber-risk management remains a continued priority as you minimize operational risks. This will also allow your firm to stay ahead of the curve as technology evolves and the standard is updated over time.

Visit CyberGuard Compliance’s website to learn more about their services relating to PCI audits.

About CyberGuard Compliance
CyberGuard Compliance is based in the United States, but serves clients around the globe. The firm’s leadership team has over 150 years of combined business management, operations and related information technology (IT) experience. CyberGuard Compliance has performed over 1,000 SOC audits, and unlike most traditional CPA firms which focus on financial statement auditing and tax compliance, CyberGuard Compliance focuses on cybersecurity and compliance related engagements. These engagements include, but are not limited to, SOC 1 Audits, SOC 2 Audits, SOC 3 Audits, SOC Readiness Assessments, ISO 27001 Assessments, PCI Compliance, HIPAA Compliance, HITRUST Compliance, Vulnerability Assessments, and Penetration Testing. For more information, please visit http://www.cgcompliance.com.

Topics: Compliance, PCI DSS

Migrating from Oracle to MongoDB with Data Security

Posted by Patrick Townsend on Mar 20, 2018 2:38:51 PM

With big cost savings in mind, many Oracle Database customers are migrating selected applications to MongoDB. Let’s take a look at some of the factors that are driving this migration, and how MongoDB helps protect and secure data after the migration.

First, how do customers protect data that is stored in an Oracle Database?

Oracle Database supports Transparent Data Encryption (TDE) through a special interface implemented in Oracle Advanced Security. Once you’ve made the costly upgrade to Oracle Advanced Security, you can configure and activate encryption of the Oracle Database through the Advanced Security Wallet application. The encryption support can be either whole-database encryption through Transparent Data Encryption or column-level encryption. If you are using the Oracle Key Manager (OKM) solution, or a third-party key manager, the interface in Oracle Wallet is usually through a PKCS#11 interface library. This is simply a software library that uses the key manager to protect the local key used for database encryption.

The increase in cost to deploy Oracle Advanced Security can be substantial. It varies depending on your licensing agreement with Oracle or one of their distributors, but it affects the cost basis for each database user.

Faced with the increased cost of deploying Oracle Advanced Security, or wishing to achieve cost savings with existing Oracle Database implementations, Oracle customers are moving to MongoDB Enterprise which has a different security strategy.

So, how do MongoDB customers protect data in an equivalent fashion?

MongoDB Enterprise edition also supports Transparent Data Encryption of the database. But this is a native part of the Enterprise edition of MongoDB and does not involve additional license costs. Additionally, MongoDB Enterprise overall pricing is substantially lower than Oracle Database [Organizations are saving 70%+ by switching from Oracle to MongoDB]. Without the need for increased costs to get the encryption option, MongoDB customers are able to achieve Transparent Data Encryption in an affordable way.

Additionally, encryption key management is based on the industry standard Key Management Interoperability Protocol (KMIP) as defined by the OASIS standards group. Most enterprise key management systems, including our Alliance Key Manager for MongoDB solution, support KMIP. This means that customers can easily deploy Transparent Data Encryption with an affordable key management solution that plugs into MongoDB through the KMIP interface. This will make security-conscious Oracle Database customers happy in achieving an equivalent level of security after the migration.

MongoDB has been doing a lot lately to woo Oracle Database customers to its platform. While MongoDB is a NoSQL big data database, it supports SQL-like interfaces to the database. This means that migrations from Oracle to MongoDB are pretty straight-forward [A Step-by-Step Guide on How to Migrate from RDBMS to MongoDB].

MongoDB also just announced a key feature that will make Oracle Database customers happy. That announcement is that MongoDB version 4 will fully support Atomicity, Consistency, Isolation and Durability (ACID) as a core feature of MongoDB. This is database geek-speak that means that MongoDB database applications will work with the same reliability as big commercial SQL databases - something that Oracle Database customers expect. I think this will help accelerate the migration from Oracle Database to MongoDB Enterprise database.

Oracle Database customers typically run a wide range of packaged and custom-built applications. I believe that early migrations from Oracle to MongoDB mostly involve those custom-built applications that are relatively simple in their architecture. Migrating from an Oracle ERP or CRM application might involve a great deal of planning and re-engineering. But I think almost every Oracle customer has some custom-built applications that are easy targets for a migration to MongoDB.

The cost savings, the ease of migration, the equivalent level of database encryption and key management security, and the new support for ACID database reliability in MongoDB version 4 are a winning combination for MongoDB as a company. I suspect that migrations from Oracle to MongoDB will pick up dramatically this year as MongoDB version 4 rolls out.

Here at Townsend Security we’ve done a lot to align with the MongoDB Enterprise strategy. We’ve formally certified our Alliance Key Manager solution with the MongoDB security team, we support both Intel and Power platforms for our key management interface, we deploy in cloud, VMware and on-premise environments, and we’ve aligned our pricing and licensing strategy with the MongoDB strategy. Entry level MongoDB customers can deploy compliant (PCI DSS, FIPS 140-2) key management in an affordable manner, and key management licensing follows the MongoDB model. Even very large MongoDB Enterprise customers will be happy with our key management licensing, scalability, and pricing strategy.

I am impressed with MongoDB’s goal of bringing enterprise level database technology to a wide variety of small, medium, and large Enterprise customers. It’s just the right time for a new, disruptive approach to databases.

If you'd like to learn more about how we are helping secure MongoDB Enterprise with encryption and key management, visit our Alliance Key Manager solution for MongoDB page.

Patrick

Encryption & Key Management for MongoDB eBook

Topics: MongoDB

High Availability Strategies for MongoDB Encryption Key Management

Posted by Patrick Townsend on Mar 2, 2018 10:27:59 AM

The MongoDB database is designed to be resilient and gives you several options for high availability and business continuity. But what about encryption key management - how do we implement high availability when we’ve deployed a key management system for MongoDB encrypted databases? Here at Townsend Security we encounter this question with MongoDB users on a regular basis. Let me share some of the approaches that we recommend.

We can take a GOOD, BETTER, and BEST view of key management high availability. Not every MongoDB database implementation needs extreme high availability, and there are costs and benefits for different approaches. So let’s run down some of the common options:

GOOD

Not everyone uses the MongoDB database for mission critical applications, and hot failover is not needed. But, of course, in the event of a disaster that takes down your data center, network, or servers, you want to be able to recover in a reasonable amount of time. In this case, you can rely on the backup and restore capabilities of Alliance Key Manager for MongoDB. Here is a diagram of a simple case of a primary and secondary MongoDB implementation that share a single key manager:

MongoDiagram-good.png

In this case we deploy a single key manager to serve both the primary and secondary nodes of a MongoDB implementation. After initializing the MongoDB database for encryption, we perform a key manager backup to archive the master encryption key for the primary and secondary nodes of the database.

Recovery of the MongoDB database may involve migration from the secondary node to the primary node when it is back online, or restoration from a backup image, and a restore operation for the key manager. Alliance Key Manager makes it easy to backup the encryption key database and configuration, and to restore from backup when needed. This is the simplest case of key management recovery when hot failover is not needed.

Remember to follow security best practices when backing up Alliance Key Manager. You will want to save the secret keys separately from the data encryption keys, and ensure separation of duties. See the Alliance Key Manager documentation for guidance on backup and restore operations.

BETTER

For many MongoDB customers the applications built on the MongoDB database represent core, mission critical applications that must be available at all times. These customers need a high availability strategy that guarantees very little loss of availability and rapid recovery. While there are different failover strategies for MongoDB customers, the normal approach would be to failover to a secondary MongoDB node at a geographically independent data center. The primary and secondary nodes would use separate key management systems which would be synchronized. A diagram might look something like this:

MongoDiagram-Better.png

The primary MongoDB node has a key manager deployed in its data center or cloud location, and the secondary MongoDB node has a different key manager deployed in its data center or cloud location. The two key managers are mirroring encryption keys in real time in an active-active configuration. Both the MongoDB data and the Alliance Key Manager instances are fully redundant in real time.

BEST

Enterprise customers who build mission critical applications on MongoDB databases and who must have full business continuity and high availability failover can achieve this with MongoDB replication and redundant Alliance Key Manager servers. A primary MongoDB node can associate two redundant key servers, and a secondary replicating MongoDB node can associate two different redundant key servers. Since MongoDB configuration only allows for the definition of a single key server, we can use a load balancer to implement the redundant key management pair. A diagram of this configuration would look like this:

MongoDiagram-Best.png

With a load balancer placed between the MongoDB database and the two key managers you can achieve hot failover in the event of a lost connection to the first key server without loss of access to the database. When the connection to the main key server is restored the load balancer will bring it back online. The two Alliance Key Manager servers automatically mirror encryption keys to each other in an active-active configuration.
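The failover behavior the load balancer provides can be sketched in ordinary client code. The Python sketch below is purely illustrative - the function and server names are hypothetical and not part of MongoDB or Alliance Key Manager - but it shows the try-primary-then-mirror pattern:

```python
# Hypothetical sketch of load-balancer-style failover: try the primary key
# server first, then fall over to its mirrored partner.

def fetch_key(servers, get_key):
    """Return the first key retrieved from an ordered list of key servers.

    get_key(server) should raise ConnectionError when a server is unreachable.
    """
    last_error = None
    for server in servers:
        try:
            return get_key(server)
        except ConnectionError as err:
            last_error = err  # this server is down; try the mirror
    raise RuntimeError("no key server reachable") from last_error
```

In the BEST configuration described here this logic lives in the load balancer, so MongoDB itself still sees only a single KMIP endpoint.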

In the event of a full loss of the primary MongoDB database, failover to the secondary MongoDB database will be handled by the MongoDB Arbiter. The fully replicated data will be available and the secondary database will be protected by a pair of Alliance Key Manager servers in the same way as the primary MongoDB database.

Note that there can be multiple secondary MongoDB nodes and each can implement a similar key management failover strategy. With the above strategy MongoDB database customers can achieve a very high level of business continuity and high availability failover.

Additional Considerations

MongoDB database deployments vary a great deal in their overall architecture and implementation. This is a testament to the flexibility of the database and its ability to meet a wide variety of customer use case scenarios. Alliance Key Manager for MongoDB can help improve security and recoverability in any MongoDB deployment.

Alliance Key Manager is available as a Hardware Security Module (HSM), as a VMware software appliance (virtual machine), and as cloud instances. The interface to all of the key managers works in exactly the same way. This means you can create hybrid deployments of MongoDB and Alliance Key Manager across clouds, and between cloud and on-premise deployments.

At the time this blog was written (March 2018) the MongoDB Atlas cloud platform did not support independent third party key management solutions through the KMIP interface. That is likely to change in the future. For Enterprise customers who must achieve exclusive custody of encryption keys, you can deploy MongoDB in a normal cloud instance and use the encryption and key management capabilities of MongoDB with Alliance Key Manager. You can then migrate to the Atlas service when it supports the KMIP interface for key management.

About Alliance Key Manager

Alliance Key Manager for MongoDB is certified by the MongoDB security team, and supports the MongoDB Enterprise pricing model. Regardless of the size of your MongoDB implementation you will find an affordable and easy-to-deploy Alliance Key Manager for MongoDB solution.

Securing Data in MongoDB

Image Credit:
Load balancer created by AlexWaZa from the Noun Project
Key created by icon 54 from the Noun Project

Topics: MongoDB Encryption

What Data Needs to Be Encrypted in MongoDB?

Posted by Luke Probasco on Feb 23, 2018 8:11:00 AM
A Checklist for Meeting Compliance

 

What Information Do I Need to Protect with Strong Encryption?

Organizations starting an encryption project always have this question on their minds. It is a simple question, but can be hard to answer. Generally speaking, you should encrypt any information that alone, or when combined with other information, can identify a unique, individual person. This is called Personally Identifying Information, or PII. This should be your starting point, but you may need to address other information depending on the compliance regulations you must meet.

[For even more information on encrypting data in MongoDB, view our Definitive Guide to MongoDB Encryption & Key Management.]


Quicklinks:

Federal/State Laws and Personally Identifiable Information (PII)

EU General Data Protection Regulation (GDPR)

Educational Information Covered by FERPA

Federal Agencies and FISMA

Medical Information for Covered Entities and HIPAA/HITECH

Payment Card Data Security Standard (PCI DSS)

Financial Data for FFIEC Compliance

 

Federal/State Laws and Personally Identifiable Information (PII)

Federal and State laws vary in terms of what they consider Personally Identifiable Information (PII), but there is a lot of commonality between them. PII is any information that, either alone or when combined with other information, can identify an individual person. Start with this list of data items:

  • Social security number
  • Credit card number
  • Bank account number
  • First name
  • Last name
  • Address
  • Zip code
  • Email address
  • Birth date
  • Password or passphrase
  • Military ID
  • Passport
  • Driver's license number
  • Vehicle license number
  • Phone and Fax numbers
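As a simple illustration of how a developer might act on this checklist, the sketch below flags which fields of a record should be considered for encryption. The field names are my own normalization of the items above; real schemas will use their own names:

```python
# Illustrative only: a hand-normalized version of the PII checklist above.
PII_FIELDS = {
    "ssn", "credit_card_number", "bank_account_number", "first_name",
    "last_name", "address", "zip_code", "email", "birth_date", "password",
    "military_id", "passport_number", "drivers_license", "vehicle_license",
    "phone", "fax",
}

def fields_to_encrypt(record: dict) -> set:
    """Return the keys of a record that appear on the PII checklist."""
    return {name for name in record if name.lower() in PII_FIELDS}

print(fields_to_encrypt({"email": "a@example.com", "order_id": 1001}))  # -> {'email'}
```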

 

EU General Data Protection Regulation (GDPR)

Under the GDPR you must protect the personal data of an individual. The definition of “personal data” is quite broad and includes “any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person”. This includes, but is not limited to:
  • Social security number
  • Credit card number
  • Bank account number
  • First name
  • Last name
  • Address
  • Zip code
  • Email address
  • Medical information
  • Birth date
  • Password or passphrase
  • Military ID
  • Passport
  • Driver's license number
  • Vehicle license number
  • Phone and Fax numbers
Personal data is broadly defined, so apply an excess of caution when protecting individual information.
 
 
Educational Information Covered by FERPA

Educational institutions that fall under the FERPA regulations must protect Personally Identifiable Information (see above) as well as the following information:

  • Student name
  • Student ID number
  • Family member names
  • Place of birth
  • Mother’s maiden name
  • Student educational records
  • Immunization records
  • Health records
  • Individuals with Disabilities (IDEA) records
  • Attendance

 

Federal Agencies and FISMA

Federal agencies must evaluate their systems for the presence of sensitive data and provide mechanisms to ensure the confidentiality, integrity, and availability of the information. Sensitive information is broadly defined, and includes Personally Identifiable Information (see above), as well as other information classified as sensitive by the Federal agency. Sensitive information might be defined in the following categories:

  • Medical
  • Financial
  • Proprietary
  • Contractor sensitive
  • Security management
  • And other information identified by executive order, specific law, directive, policy or regulation
 
Medical Information for Covered Entities and HIPAA/HITECH
The HIPAA / HITECH Act defines Protected Health Information to include Personally Identifying Information (see above) in addition to the following Protected Health Information (PHI):
  • Patient diagnostic information (past, present, future physical or mental health)
  • Patient treatment information
  • Patient payment information
  • Medical record numbers
  • Name
  • Street Address
  • City
  • Zip code
  • County
  • Health plan beneficiary numbers
  • Fingerprints and other biometric identifiers
  • Full facial photographs and images
  • Device identifiers and serial numbers
  • IP address numbers and web URLs
  • Any other individual identifiable information
 
Payment Card Data Security Standard (PCI DSS)
The Payment Card Industry Data Security Standards (PCI DSS) require that merchants protect sensitive cardholder information from loss and use good security practices to detect and protect against security breaches.
 
If you accept or process credit card or other payment cards, you must encrypt the following data:
  • Primary Account Number (PAN)
You must also NOT store, even in encrypted format:
  • Track 1 and Track 2 data
  • Security codes (CVV, CVV2, etc.)
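For PAN data that must be displayed, PCI DSS also permits masking. A minimal sketch of the common first-six/last-four masking rule follows; this is for display only, and is not a substitute for encrypting the stored PAN:

```python
# Display masking sketch: keep the first six and last four digits of a PAN.
# Masking for display does not replace encryption of the stored value.

def mask_pan(pan: str) -> str:
    """Mask a PAN for display, keeping the first six and last four digits."""
    digits = pan.replace(" ", "").replace("-", "")
    return digits[:6] + "*" * (len(digits) - 10) + digits[-4:]

print(mask_pan("4111 1111 1111 1111"))  # -> 411111******1111
```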
 
Financial Data for FFIEC Compliance
Banks, credit unions, and other financial institutions must protect Non-public Personal Information (NPI), which includes personally identifying financial information. In addition to the Personally Identifiable Information listed above, you should protect:
  • Income
  • Credit score
  • Collection history
  • Family member PII and NPI

 

Encrypting Data in MongoDB
Townsend Security is helping the MongoDB community encrypt sensitive data and properly manage encryption keys. Developers who need to protect sensitive data know that storing their encryption keys within MongoDB puts their data at risk for a breach. With Alliance Key Manager for MongoDB, administrators are now able to keep their encryption keys secure by storing them remotely and only accessing them when the encryption/decryption happens.

Alliance Key Manager for MongoDB is an enterprise key management solution that allows you to easily encrypt sensitive data with NIST-validated AES encryption and securely retrieve and manage encryption keys from Townsend Security’s FIPS 140-2 compliant Alliance Key Manager. With an easy to use interface and certifications to meet compliance requirements, you can rest assured knowing your data is secure.
 
Encryption and key management for MongoDB
 

 


Topics: Compliance, MongoDB

Press Release: Alliance Two Factor Authentication for IBM i Now Supports the New PCI Standard for 2FA with Authy

Posted by Luke Probasco on Jan 30, 2018 8:20:18 AM

IBM i (iSeries, AS/400) users can now meet PCI security recommendations for multi-factor authentication with a mobile-based solution.

Today Townsend Security announced a major enhancement to Alliance Two Factor Authentication for IBM i to fully support the new Payment Card Industry (PCI) recommendations for multi-factor authentication with Authy. Authy (A Twilio company) is one of the most popular mobile-based authentication solutions and is in wide use to protect web credentials.

Townsend Security’s support for Authy means that IBM i (iSeries, AS/400) users can now deploy a popular and low-cost two factor authentication product without the expense of back-end hardware servers and hardware tokens. The Authy application installs on your mobile device or in your browser and provides Time-based One Time Passwords (PIN codes) on demand. Since Authy TOTP codes do not require a mobile network connection or an Internet connection, they are immune from gaps in connectivity to the network. Authentication on the IBM i platform simply requires opening the Authy application on your phone, viewing the one time code, and entering it on your IBM i signon screen. Alliance Two Factor Authentication then verifies the code with the Authy service and allows access to the IBM i platform.
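The time-based one time passwords described here follow the TOTP standard (RFC 6238, which builds on the RFC 4226 HOTP algorithm). A minimal Python sketch of the algorithm - for illustration only, not the Authy implementation - looks like this:

```python
import hashlib
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one time password (RFC 4226)."""
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                  # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(key: bytes, at=None, step: int = 30, digits: int = 6) -> str:
    """Time-based one time password (RFC 6238): HOTP over a 30-second counter."""
    now = time.time() if at is None else at
    return hotp(key, int(now // step), digits)

# Example with the RFC 6238 test secret at T=59 seconds
print(totp(b"12345678901234567890", at=59))  # -> 287082
```

Because both sides compute the code from a shared secret and the current time window, no network connection is needed to generate a code - which is why Authy codes work without mobile or Internet connectivity.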

Alliance Two Factor Authentication also now implements multi-factor authentication that is compliant with the new PCI guidance which requires that a user enter a user ID and password (something they know) at the same time that they enter their one time code generated by Authy on the mobile device (something you have). The Townsend Security solution implements a secondary user ID and password to use with Authy authentication to meet this level of compliance. A failed authentication on the IBM i server never discloses whether the user ID and password were invalid, or whether the one time code was invalid. This logic prevents the disclosure of important credential information that is common in Two Step Verification. An additional benefit to using the Authy application is that recovery from the loss of a mobile phone is simple and straightforward.

Because Authy uses a secure, time-based one time code and does not use SMS text delivery, it is secure and meets security best practices for authentication. Townsend Security’s Alliance Two Factor Authentication solution continues to support SMS text delivery of one time codes, but the new Authy facility is the default for new installations.

“IBM i users need an affordable two factor authentication solution that removes the expense and headaches of hardware-based solutions. By using your mobile phone for the generation of one time codes, you never have to worry about administering a large number of hardware tokens,” said Patrick Townsend, CEO of Townsend Security. “The Authy service is secure, extremely affordable, easy to administer, and highly performant. IBM i customers can install Alliance Two Factor Authentication in a few minutes, provision an Authy account on their web site, and be using two factor authentication very quickly. It’s a fast path to PCI compliance and better security.”

You can find the PCI guidance document here.

Alliance Two Factor Authentication is licensed on a per logical partition (LPAR) basis, with perpetual and subscription licensing options available. Existing Alliance Two Factor Authentication customers on a current maintenance contract can upgrade to the new version at no charge.

Two Factor Authentication on the IBM i

 

Topics: Alliance Two Factor Authentication, Press Release

IBM i FieldPROC Encryption, IBM Query, and Encrypted Indexes

Posted by Patrick Townsend on Jan 29, 2018 8:31:08 AM

The IBM i DB2 database team implemented column level encryption through Field Procedures (FieldProc) in release 7.1 of the IBM i operating system. It was a great step forward for those IBM i customers who need to encrypt sensitive data to meet compliance regulations and to improve overall security. With release 7.1 it was now possible to implement DB2 encryption without modifying user applications.

Prior to this enhancement to DB2 in release 7.1, implementing encryption could be a laborious process of modifying applications and performing extensive regression testing. This approach did not work well with some types of fields (date, time, etc.) and many IBM and third-party query utilities just did not work correctly. So the DB2 enhancement for Field Procedures was a great step forward.

While FieldProc worked well with native SQL applications in release 7.1, there were limitations for older RPG and COBOL applications, and many IBM query utilities did not work correctly with encrypted indexes. Many IBM i customers use IBM and third-party query programs for rapid development of reports and displays of data. Some customers that I’ve talked to have thousands of queries in their application mix, so limitations of IBM query with FieldProc represented an insurmountable challenge for many. When FieldProc was used to encrypt an index or key field, queries just would not work correctly and data was missing or out of order in reports and displays.

But there is some good news!

Starting with the 7.2 release of the IBM i operating system, many of the IBM query applications were updated to work with the native DB2 SQL Query Engine (SQE) by default. The SQL Query Engine has never had a problem with encrypted indexes. This means that the IBM query applications now work seamlessly with data that is encrypted with FieldProc in DB2. You can fully deploy column level encryption across multiple index columns with FieldProc, and your queries will work fine.

Many IBM i customers experimented with FieldProc in the first release in version 7.1 of the operating system and decided to take a pass. If you had that experience it is time to take another look at DB2 FieldProc encryption. The current version of the IBM i operating system is 7.3 and most IBM i customers have upgraded to this release. You now have good support for IBM queries, the SQL Query Engine, and DB2 FieldProc encryption.

While some challenges remain for older OPM and ILE RPG applications, we’ve been able to help a number of customers meet these challenges.

Encryption of data is a core part of a defense-in-depth strategy. We have to do a lot of things to protect our IBM i data, and one of those things is to encrypt the data at rest with industry standard encryption algorithms. DB2 Field Procedures provides the path to achieving this.
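For reference, attaching a Field Procedure to a column is a single DB2 for i SQL statement. The library, table, column, and program names below are examples only:

```sql
-- Attach a field procedure (encryption exit program) to a column
ALTER TABLE MYLIB.CUSTOMERS
  ALTER COLUMN SSN
  SET FIELDPROC MYLIB.FLDPROC1;
```

Once the FIELDPROC is attached, DB2 calls the exit program transparently on every read and write of the column, which is why user applications do not need to change.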

To read more about IBM i support for SQL Query Engine in query applications such as RUNQRY, OPNQRYF, and others, see this link.

Our Alliance AES/400 Encryption solution provides full support for DB2 Field Procedures, is easy to deploy, and affordable for every IBM i customer. 

For industry standard encryption key management you can deploy our Alliance Key Manager solution which is seamlessly integrated with DB2 Field Procedure encryption.

Patrick

Key Management for IBM i - Sources of Audit Failures

Topics: Encryption, IBM i, FIELDPROC

 

 
