
Townsend Security Data Privacy Blog

Microsoft SQL Server Encryption Key Management

Posted by Patrick Townsend on Mar 20, 2017 2:01:48 PM

The hardest part of an encryption strategy is the proper management of encryption keys. Failing to protect encryption keys puts protected data at risk and fails to meet security best practices and compliance regulations. For Microsoft SQL Server customers who have already implemented Transparent Data Encryption (TDE) or Cell Level Encryption (CLE), the biggest cause of an audit failure is the lack of good encryption key management.

This is the fourth in a series on the topic of Microsoft SQL Server encryption. Let’s look at some of the characteristics of good encryption key management for SQL Server.

Extensible Key Management (EKM) Providers
As we’ve discussed previously, it is the responsibility of key management vendors to provide the Extensible Key Management (EKM) Provider software that is installed and registered with the SQL Server database to enable either TDE or CLE encryption. The software from the key management vendor is installed on the SQL Server instance and provides both encryption and key management services. The SQL Server database administrator does not need to be involved in the actual retrieval of an encryption key - that is the job of the EKM Provider software.

EKM Provider software must handle the encryption and decryption of the database key for Transparent Data Encryption, and must handle the retrieval of a symmetric key for Cell Level Encryption. Key retrieval should be performed in a manner that protects the encryption key from loss or exposure on the network, protects the key while in memory, and properly logs the key retrieval event to a system log repository. Encryption key retrieval is normally protected through the use of a secure TLS network connection between the EKM Provider software on SQL Server and the key manager hardware or virtual machine. There are many other critical aspects of EKM Provider key management implementations, and these will be discussed in a future series.

Enterprise Key Management Solutions
The proper generation, storage, protection and management of encryption keys is the core purpose of professional encryption key management solutions. As a security device, an encryption key manager is responsible for creating strong encryption keys that meet industry standards and for protecting those keys from loss throughout their lifecycle. Encryption key managers may be hardware security modules (HSMs), virtual servers (VMware, Hyper-V, etc.), or multi-tenant or dedicated cloud instances. In addition to implementing industry standards for encryption key management, key servers provide a variety of authentication, systems management, and audit functions to meet security best practices and compliance regulations. Microsoft SQL Server customers who want to achieve compliance with common regulations should look to deploy a professional, certified and validated key management solution.

Key Management Industry Standards
Encryption key management systems are cryptographic modules that perform a variety of functions. As cryptographic modules they fall under the standards of the National Institute of Standards and Technology (NIST), and key managers should provably meet NIST standards. The relevant NIST standard for encryption key management is the Federal Information Processing Standard 140-2 (FIPS 140-2), “Security Requirements for Cryptographic Modules”. Key management solutions which implement the FIPS 140-2 standard will ensure the generation of strong encryption keys, the protection of those keys from corruption or substitution, and an implementation of encryption that provably meets NIST cryptographic standards.

In addition to providing standards for encryption key management, NIST also provides a method for vendors to validate that their solutions meet the standard. Encryption key management solutions are tested by chartered security testing laboratories and are then approved directly by NIST. NIST publishes the solutions that have passed FIPS 140-2 testing, and Microsoft SQL Server customers should look for FIPS 140-2 validation of any key management solution used to protect the database.

Migrating Locally Stored Keys to Key Management
Many Microsoft SQL Server users start their encryption projects by using the option to store the database encryption key locally on the SQL Server instance. While this is not a security best practice, it is a common way to start an encryption project.

Fortunately, it is easy to migrate a locally stored encryption key to a proper key management solution. The migration moves the SQL Server database key under key management protection and does not require decryption of the database. The database key, currently protected by local keys and certificates, is placed under the protection of the key manager. The EKM Provider software of your vendor then becomes responsible for unlocking the database key (TDE) or retrieving the symmetric key for Cell Level Encryption (CLE).

OASIS Key Management Interoperability Protocol (KMIP)
Many SQL Server customers ask about the KMIP standard for integrating with key managers. While KMIP is important for many reasons, it does not apply to the Microsoft EKM Provider interface. The EKM Provider interface leaves it to the key management vendor to perform the needed cryptographic functions on the key server. These functions do not map to KMIP operations and attributes. While it is advisable to deploy key management solutions that meet KMIP standards, it is not required for SQL Server encryption.

To this point we have defined the SQL Server encryption architecture, options for implementing SQL Server encryption (TDE and CLE), and basic requirements for encryption key management. In the next part of this series we will look at EKM Provider implementation topics as well as business continuity topics.

Patrick

Topics: SQL Server, SQL Server encryption

Case Study: Citizens Security Life Insurance

Posted by Luke Probasco on Mar 13, 2017 10:54:24 AM

Compliance Made Easy - Protecting Private Information with Alliance AES/400 Encryption for IBM i and Alliance Key Manager for VMware


“Townsend Security was extremely easy to work with - from the sales process to deploying our proof of concept to post-sales support.”

- Adam Bell, Senior Director of IT

Citizens Security Life Insurance

Citizens Security Life Insurance Company is a life and health insurance carrier. The company offers group benefits including dental and vision coverage, and individual ancillary insurance products. The company was founded in 1965 and is headquartered in Louisville, Kentucky.

The Challenge: Protect ePHI & PII on the IBM i

In order to meet growing partner requirements and pass a data security audit for protecting electronic Protected Health Information (ePHI) and Personally Identifiable Information (PII), Citizens Security Life Insurance (CSLI) needed to deploy an encryption solution on the IBM i. The solution needed to be easy to implement with excellent performance.

While FIELDPROC on the IBM i makes it very easy to encrypt data without application changes, CSLI also understood that for encrypted data to truly be secure, they would need to store and manage encryption keys with an external key manager.

By using a VMware-based encryption key manager, the company could meet encryption and key management best practices for separating encryption keys from the data they protect.

The Solutions

Alliance AES/400 Encryption

“The performance we are seeing with Alliance AES/400 encryption is excellent,” said Adam Bell, Senior Director of IT, Citizens Security Life Insurance. “The solution was easy to integrate and completely met our expectations.”

Alliance AES/400 FIELDPROC encryption is NIST-compliant and optimized for performance. The solution is up to 100x faster than equivalent IBM APIs on the IBM i platform.

With Alliance AES/400, businesses can encrypt and decrypt fields that store data such as credit card numbers, social security numbers, account numbers, ePHI, and other PII instantly without application changes.

Alliance Key Manager for VMware

“Alliance Key Manager for VMware was very easy to implement and the resources Townsend Security provided made deployment a smooth process,” continued Bell. By deploying Alliance Key Manager for VMware, CSLI was able to meet their business needs with a solution that could not only deploy quickly, but was also easy to set up and configure.

Alliance Key Manager for VMware leverages the same FIPS 140-2 compliant technology found in Townsend Security’s hardware security module (HSM) and in use by over 3,000 customers. The solution brings a proven and mature encryption key management solution to VMware environments, with a lower total cost of ownership. Additionally, the key manager has been validated to meet PCI DSS in VMware environments.

Integration with the IBM i Platform

An encryption strategy is only as good as the key management strategy, and it can be difficult to get key management right. For companies doing encryption, the most common cause of an audit failure is an improper implementation of key management. The seamless integration between Alliance AES/400 and the external Alliance Key Manager for VMware allowed CSLI to pass their data security audit with flying colors.

“The relationship we developed with Townsend Security enabled us to have a painless sales and support process, and in turn, enabled us to easily pass our data security audit,” finished Bell.


Topics: Alliance Key Manager, Alliance AES/400, Case Study

A Brief History of KMIP

Posted by Ken Mafli on Mar 6, 2017 1:31:39 PM

Key Management Interoperability Protocol (KMIP) is quickly becoming the industry standard for ensuring your product or software can communicate seamlessly with cryptographic key managers.  In fact, a study by the Ponemon Institute in 2013 reported on the state of encryption trends and found that “more than half of those surveyed said that the KMIP standard was important in cloud encryption compared with 42% last year.”  This is surprising since KMIP v1.0 was first ratified three short years earlier, on October 1st, 2010!

How Did it All Start?

The first meeting held to start discussing the new set of standards was on April 24th, 2009 in San Francisco, in conjunction with the RSA convention that year.  In attendance were representatives from RSA, HP, IBM, Thales, Brocade, and NetApp. Their initial scope was to “develop specifications for the interoperability of key management services with key management clients. The specifications will address anticipated customer requirements for key lifecycle management.”

But why was KMIP necessary to begin with?  The short answer: more and more organizations were deploying encryption in multiple environments.  But with encryption comes the need to properly manage the encryption keys. With encryption increasing across multiple enterprise applications it became harder to easily manage the keys from the different enterprise cryptographic applications.  Better standards were needed to create uniform interfaces for the centralized encryption key manager.

Companies soon saw the benefits of adopting KMIP.  Both large and small organizations need their key management to work every time, and need it to scale as their organization grows.  And while other work had been done to address this issue (OASIS EKMI, IEEE P1619.3, and IETF Keyprov), KMIP was designed to have a broader scope than its predecessors and to give the industry more comprehensive standards.


How Was KMIP Initially Received?

In 2010, KMIP debuted at RSA.  HP, IBM, and others demonstrated that their client programs using the KMIP version 1.0 protocol could “communicate securely with key management servers. The clients and servers [demonstrated] essential use cases such as generating cryptographic keys, locating existing keys, and retrieving, registering, and deleting keys.”

In 2011 at the RSA Conference, major players like IBM, RSA, and HP demonstrated KMIP 1.0 compatibility with their client programs.  And again in 2012 and 2013, even more companies, like Thales, NetApp, and Townsend Security, demonstrated KMIP compliance.  With all these prominent players becoming KMIP compatible, it was a major signal to the industry that KMIP was rapidly becoming the standard for interoperable communications with key managers.

How is KMIP Thought of Now?

Fast forward to 2014.  The Storage Networking Industry Association (SNIA) announced a testing program for KMIP conformance for its members.  In their words, “By introducing the KMIP Test Program for the industry, we’re helping to encourage not only the adoption of enterprise–class key management, but a means for vendors to test for conformance and provide an assurance of interoperability and a layer of trust to their customers.”

At OASIS’ Interoperability Showcase at RSA 2016, 16 companies, including Townsend Security, demonstrated KMIP compatibility.  And with the likes of VMware, Oracle, Quantum, and many others demonstrating compatibility as well, KMIP has become a dominant standard in key management interoperability.

Final Thoughts

Encryption is your last, best defense for data at rest.  But encryption is only as good as your key management.  If the key is exposed to hackers, the data is lost as well.  This is why key management standards like KMIP have already attracted considerable interest, and will continue to do so.  The ability to have a variety of vendor applications, platforms, and databases all able to communicate with a centralized key manager enhances the data security posture of the enterprise.  And this is what organizations should strive to achieve.

OASIS built the standard to address a broader scope of issues than older industry standards addressed. But KMIP is still actively being matured by OASIS (we are on version 1.3), and we should expect to see further enhancements and revisions to the standard as well as broader industry adoption.  This should give us confidence that KMIP, as a well-accepted, road-tested standard, will continue to grow in industry popularity in the years to come.


Topics: Encryption Key Management

Hillary's email data breach taught us all the wrong lessons

Posted by Ken Mafli on Feb 28, 2017 9:11:00 AM

In an unprecedented October surprise, Wikileaks dumped thousands of emails onto the internet from the Democratic National Committee (DNC), most of them concerning Hillary Clinton’s presidential campaign.  Later, in defending this move, Wikileaks founder Julian Assange, in an interview with FOX News, “said a 14-year-old could have hacked into the emails of Hillary Clinton's campaign chairman,” reported the Daily Mail.  Assange later revealed in the interview that the password of John Podesta, Hillary’s campaign chairman, was 'password.'  Politifact has gone on to challenge that assertion, saying that “Podesta was using a Gmail account, and Google doesn’t allow users to make their passwords ‘password.’”

Whatever John Podesta’s password was, it has sparked a good deal of renewed interest in good password management.  And far be it from me to downplay this crucial bit of data security.  We still have a long way to go.  In fact, SplashData just completed their survey of over 5 million people’s passwords and found that over 10% of people still use easily guessable passwords like:

  • password
  • 123456
  • qwerty
  • passw0rd
  • Password1
  • zaq1zaq1

If you use any of these, stop it. Now.

But if that is all that we learn from the hack and subsequent data breach, we have missed the lesson.  As far back as June of 2016, it was widely reported, by the likes of Brian Krebs and Jeremy Kirk, that the DNC was vulnerable to attack due to systemic weaknesses in cybersecurity.  In fact, in Jeremy Kirk’s article, it was noted that a press assistant emailed everyone a new password after a recent breach (a strong password at that: 'HHQTevgHQ@z&8b6').  The irony is, some of the email accounts had already been compromised.  The hackers needed only to open the email and use the new password.

Strong passwords alone are not enough to rebuff the efforts of hackers to gain entry, nor do they render the data useless in case of a breach.  We need proven security measures in order to keep the data safe.

The data security measures below reflect specific things you can do to secure your data-at-rest in general. While there are more specific measures you can take for email servers, it is important to remember that organizations have sensitive data everywhere, not just in emails.  That being said, since even seemingly benign emails at the DNC can blow up into political controversy, the DNC probably needs to follow these along with more email-specific recommendations.  Follow along to find some of the best methods your organization should be using today to strengthen your data security posture.

Multi-Factor Authentication

As we have already mentioned, usernames and passwords, by themselves, are not enough to authenticate users.  Truly strong passwords are hard to manage and remember.  And once a system is compromised, login credentials can be scraped with keyloggers, malware, or other such attacks.

You need an external verification process.  You need multi-factor authentication (MFA). MFA has traditionally relied on verifying you by two of three ways:

  • Something that you know (i.e.: username, password, challenge questions/responses, one-time-use code, etc.)
  • Something that you have (i.e.: token, RFID cards or key fobs,  mobile phones, etc.)
  • Something that you are (biometrics)

Each of these methods has its advantages and drawbacks. For example:

  • Challenge Questions:
    • PRO: do not require any physical equipment on the user side
    • CON: do rely on the user’s memory, which can be fuzzy when it comes to precisely writing the correct response
    • CON: are vulnerable to deduction through inspection of social media accounts, etc.
    • CON: are “something you know” and so fall into the same category as login credentials, thereby not taking advantage of any other kind of authentication
  • Physical Equipment: (like RFID cards and tokens)
    • PRO: do not rely on a person’s memory
    • CON: can be stolen or lost
    • CON: require active device management from an administrator

One method of authentication that is gaining ground because of its ease of use is authentication that relies on the time-based one-time password (TOTP), an OATH open standard.  It does not rely on physical fobs (which can be lost) or an SMS text (which can be intercepted).  Instead, it relies on cryptographic code that generates a time-specific one-time-use code based on the user’s secret key and the current time. Since the code is computed simultaneously (and separately) on the user’s device (typically a mobile phone) and on an internal server, with no need for an internet connection, it greatly reduces downtime due to internet issues and the risk of hackers intercepting the one-time-use code.

Encryption

Strong Advanced Encryption Standard (AES) encryption, as put forward by NIST, should be used to encrypt all sensitive customer and company data.  In 2001 NIST formally adopted the AES encryption algorithm.  Since then, it has been proven countless times to render the data useless in the event of a breach.  In fact, it would take the fastest supercomputer 375 x 10^50 years to brute force AES encryption by running through all permutations of an AES 256-bit encryption key.  In comparison, the Sun will reach its Red Giant stage in 54 x 10^8 years, engulfing Mercury, Venus, and possibly Earth.  In other words, the Earth will be incinerated by the then rapidly expanding Sun before a hacker could effectively crack AES encryption through brute force.
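
As a rough sanity check on numbers of that scale: there are 2^256 (about 1.2 x 10^77) possible 256-bit keys, so even a hypothetical machine testing 10^18 keys per second - faster than any supercomputer in operation today - would need about 1.2 x 10^59 seconds, or roughly 3.7 x 10^51 years, to exhaust the keyspace.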

The good news: AES encryption comes standard in most databases’ native encryption libraries.  Along with those free versions, there are a number of commercial products that rely on AES encryption.  So finding a way to secure your data with AES encryption will be fairly easy.  That being said, it is important to understand the development time and performance cost of each solution. Native encryption libraries are generally free but take a bit of development time.  Commercial solutions take less time to deploy, but many are file/folder level encryption products and carry performance hits because they take longer to encrypt/decrypt than column level encryption products.

Centralized Encryption Key Management

As we mentioned, AES encryption is extremely difficult to attack by brute force.  Its strength lies in its ability to encrypt the data with a very long key (typically 256-bit). But its strength is also its weakness.  If your encryption key becomes known to a bad actor, your encrypted data becomes compromised.  That is why any encryption strategy worth its salt will include proper, centralized encryption key management.

When defending your encryption key with full lifecycle key management, consider these things:

  • The encryption keys should be logically or physically separated from the encrypted data.  This way, if the encrypted data is compromised, an attacker will not be able to decipher it.
  • The encryption keys should only be generated with a cryptographically secure pseudo-random number generator (CSPRNG).
  • Restrict administrator and user access to the keys to the least amount of personnel possible.
  • Create clear separation of duties to prevent improper use of the keys by database administrators.
  • Manage the full lifecycle of the keys, from key creation through activation, expiration, archival, and deletion.

For a more comprehensive view of encryption key management, please view the Definitive Guide to Encryption Key Management.

Real Time Log Monitoring

Forrester, in 2013, promulgated the cybersecurity model of “Zero Trust.”  In it, they put forward the motto: “never trust, always verify.”  By this, they mean that all users should be authenticated, restricted to the least amount of data possible, and verified, through real-time monitoring, to be doing the right thing.  To that end, they advocate:

  • Real Time Event Collection in which you collect and log all events, in real time.
  • Event Correlation in which you analyze all events and narrow in on the ones that do not conform to expected patterns.
  • Resolution Management in which you investigate all suspect behavior and classify it as either benign or a possible threat for further investigation.

There are many Security Information Event Management (SIEM) tools available that accomplish this.  For more information, refer to Gartner’s SIEM Magic Quadrant to find the tools that fit your needs.

Final Thoughts

Defending data-at-rest is a never-ending struggle of building robust defenses and continuous improvement.  It's not a question of if, but when, a data breach will happen.  And if the DNC data breaches taught us anything, it is that breaches can be embarrassing and costly.  Since hackers are only growing more sophisticated in their techniques, it is incumbent upon us to respond with ever-increasing agility and sophistication of our own.

The old models of the high, guarded perimeter with complex passwords to gain entry are just not enough.  We need a higher degree of authentication, sensitive data rendered useless, and constant real-time monitoring of all traffic.  Your data depends on it.


Topics: Data Security

Microsoft SQL Server Automatic Encryption - Cell Level Encryption

Posted by Patrick Townsend on Feb 21, 2017 9:11:00 AM

In this third part of the series on Microsoft SQL Server encryption we look at Cell Level Encryption, or CLE, which is Microsoft terminology for Column Level Encryption. With CLE, the manner and timing of SQL Server’s calls to the EKM Provider software are quite different than for Transparent Data Encryption. It is important to understand these differences in order to know when to use CLE or TDE. Let’s look at some aspects of the CLE implementation:

Encrypted Columns
Cell Level Encryption is implemented at the column level in a SQL Server table. Only the column you specify for encryption is protected with strong encryption. You can specify more than one column for CLE in your tables, but care should be taken to avoid the performance impacts of multiple column encryption (see below).

With Cell Level Encryption you may be able to minimize some of the encryption performance impacts on your SQL Server database. Because the EKM Provider is only called when the column must be encrypted or decrypted, you can reduce the encryption overhead with careful implementation of your database application code. If a SQL query does not reference an encrypted column, the EKM Provider will not be invoked to perform decryption. As an example, if you place the column Credit_Card under CLE encryption control, this query will not invoke the EKM Provider for decryption because the credit card number is not returned in the query result:

SELECT Customer_Number, Customer_Name, Customer_Address FROM Orders ORDER BY Customer_Name;

You can see that judicious use of SQL queries may reduce the need to encrypt and decrypt column data.

SQL Application Changes
Unlike Transparent Data Encryption, you must make a change to the SQL statement in order to implement Cell Level Encryption. The SQL Server functions “encryptbykey” and “decryptbykey” are used in SQL statements. Here is an example of a SQL query that encrypts data for a CLE-encrypted column:

select encryptbykey(key_guid('my_key'), 'Hello World');

Implementing CLE encryption in your SQL Server database requires modifications to your applications, but may be well worth the additional work.
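
To make the application changes concrete, here is a minimal sketch of storing and reading a CLE-protected column. The table, column, and key names are hypothetical, and the symmetric key my_key is assumed to have been created through your EKM Provider:

-- Encrypt on insert with the EKM-managed symmetric key (hypothetical names):
INSERT INTO Orders (Customer_Number, Credit_Card)
VALUES (1001, ENCRYPTBYKEY(KEY_GUID('my_key'), '4111111111111111'));

-- Decrypt on read; DECRYPTBYKEY returns varbinary, so convert back to character data:
SELECT Customer_Number, CONVERT(varchar(19), DECRYPTBYKEY(Credit_Card)) AS Credit_Card_Plain FROM Orders;

Note that DECRYPTBYKEY does not take a key name; SQL Server identifies the key from a GUID stored with the ciphertext, which is why only the encrypting statement names the key explicitly.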

Encryption and Key Retrieval
The EKM Provider software is called for each column value to perform encryption and decryption. This means a larger number of calls to the EKM Provider compared to Transparent Data Encryption. Because the number of calls to the EKM Provider may be quite large it is important that the encryption and key management functions of the EKM Provider are highly optimized for performance (see the next section).

The EKM Provider software from your key management vendor is responsible for performing encryption of the data. From a compliance point of view it is important to understand the encryption algorithm used to protect data. Be sure that the EKM Provider software uses a standard like the Advanced Encryption Standard (AES) or other industry recognized standard for encryption. It is common to use 128-bit or 256-bit AES for protecting data at rest. Avoid EKM Providers which implement non-standard encryption algorithms.

Encryption Key Caching
When deploying CLE it is important that the EKM Provider software optimize both encryption and key management. The number of calls to the EKM Provider software can be quite high. Good EKM Providers will securely cache the symmetric key in the SQL Server context rather than retrieve the key on each call. The retrieval of an encryption key from a key server takes precious time, and multiple calls to retrieve a key can have severe performance impacts. Secure key caching is important for CLE performance. The Microsoft Windows Data Protection Application Programming Interface (DPAPI) is commonly used to protect cached keys.

Performance Considerations
When properly implemented, Cell Level Encryption can reduce the performance impact of encryption on your SQL Server database. For very large tables with a small number of columns under encryption control, the performance savings can be substantial. This is especially true if the column is used less frequently in your applications.

CLE Vendor Note
Note that each vendor of EKM Provider software implements encryption and key management differently. Some EKM Providers only implement Transparent Data Encryption (TDE). If you suspect you will need Cell Level Encryption, be sure that your key management vendor supports this capability.

In the next part of this series we will look at encryption key management in SQL Server.

Patrick


Topics: SQL Server, Cell Level Encryption, SQL Server encryption

Microsoft SQL Server Automatic Encryption - Transparent Data Encryption

Posted by Patrick Townsend on Feb 14, 2017 8:33:00 AM

In this second part of the series on Microsoft SQL Server encryption I want to focus on Transparent Data Encryption, or TDE. Most Microsoft customers who implement encryption in SQL Server use TDE as it is the easiest to deploy. No code changes are required, and enabling encryption requires just a few commands from the SQL Server console. Let’s look at some of the characteristics of a TDE implementation.

Database Encryption
TDE involves the encryption of the entire database space in SQL Server. There is no need or ability to select which tables or views are encrypted; all tables and views in a database are encrypted at rest (on disk). When data is read from disk (or any non-volatile storage) SQL Server decrypts the entire block, making the data visible to the database engine. When data is inserted or updated SQL Server encrypts the entire block written to disk.

With TDE all of the data in your database is encrypted. This means that non-sensitive data is encrypted as well as sensitive data. There are advantages and disadvantages to this approach - you expend computing resources to encrypt data that may not be sensitive, but you also avoid mistakes in identifying sensitive data. By encrypting everything at rest you are also protected from expansion of regulatory rules about sensitive data protection.

[Diagram: SQL Server decrypts database blocks as they are read from disk]

Protection of the Symmetric Key
When you enable Transparent Data Encryption on your SQL Server database the database generates a symmetric encryption key and protects it using the EKM Provider software from your key management vendor. The EKM Provider software sends the symmetric key to the key server where it is encrypted with an asymmetric key. The encrypted database key is then stored locally on disk in the SQL Server context.
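
A hedged sketch of that sequence in T-SQL follows. The key and database names are hypothetical, the provider name (KeyConnection) matches the EKM Provider examples elsewhere on this blog, and vendor-specific steps such as creating a credential for the provider are omitted:

use master;
-- Reference the asymmetric key pair that lives on the key server, via the EKM Provider:
create asymmetric key tde_kek from provider KeyConnection with provider_key_name = 'TDE_KEK', creation_disposition = open_existing;

use MyDatabase;
-- Generate the database encryption key and protect it with the key server's asymmetric key:
CREATE DATABASE ENCRYPTION KEY WITH ALGORITHM = AES_256 ENCRYPTION BY SERVER ASYMMETRIC KEY tde_kek;

ALTER DATABASE MyDatabase SET ENCRYPTION ON;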

When you start a SQL Server instance the SQL Server database calls the EKM Provider software to decrypt the database symmetric key so that it can be used for encryption and decryption operations. The decrypted database key is stored in protected memory space and used by the database. The encrypted version of the database key remains on disk. In the event the system terminates abnormally, the only version of the database key is the encrypted version on disk.

Starting the SQL Server Instance
During normal operation of SQL Server there is no invocation of the EKM Provider software and therefore no communication with an external key manager. Every normal restart of the SQL Server database instance will cause the EKM Provider software to be called to unlock the database key on the key server.

It should be noted that it is the responsibility of the EKM Provider software to handle network or key server failure conditions. SQL Server itself has no visibility on the connection to an encryption key management solution. If the EKM Provider software is unable to retrieve an encryption key, the SQL Server start request will fail. We will discuss business continuity issues in more detail later in this series.

Protecting Database Logs
SQL Server logs may contain sensitive data and therefore must also be encrypted. Transparent Data Encryption addresses this by fully encrypting database logs along with the database itself. It is important to remember that encryption of the logs will only start after TDE is activated AND after you stop and restart the database log. If you neglect to restart logging, sensitive data may be exposed in the SQL Server log files.

Table and Index Scanning
Certain SQL operations on indexes require that the SQL Server database have visibility on the entire index of a column. An example of a SELECT statement would be something like this:

SELECT Customer_Name, Customer_Address FROM Orders WHERE Credit_Card='4111111111111111';

To satisfy this SQL query the database must inspect every row in the table Orders. With TDE this means that the column Credit_Card must be decrypted in every row. Similar operations with the ORDER BY clause can cause table or index scans.

Performance Considerations
Transparent Data Encryption is highly optimized for encryption and decryption tasks and will perform well for the majority of database implementations. Microsoft estimates the performance impact of TDE at 2% to 4%, and we find this accurate for most of our customers. However, Microsoft SQL Server customers with very large SQL Server databases should use caution when implementing TDE. Be sure that you fully understand the impact of TDE on your applications’ use of large tables. It is always recommended that you perform a proof-of-concept project on very large databases to fully assess the performance impact of encryption.

In the next part of this series we will look at the other option for SQL Server encryption - Cell Level Encryption, also called column level encryption.

Patrick


Topics: SQL Server, Transparent Data Encryption (TDE)

Microsoft SQL Server Encryption - An Introduction

Posted by Patrick Townsend on Feb 6, 2017 2:33:16 PM

In 2008 the Payment Card Industry Data Security Standard (PCI-DSS) was gaining serious traction and Microsoft released SQL Server 2008 with built-in support for encryption. This was no coincidence. In addition to the PCI standard which mandated encryption of credit card numbers, numerous states in the US had also adopted data breach notification laws with strong recommendations for encryption. The compliance environment was changing dramatically and the SQL Server group at Microsoft provided a path to meet those new compliance regulations. This was a prescient and crucially important enhancement for Microsoft customers - the security threats have increased over time and compliance regulations have become more stringent.

In this multi-part series I want to talk about how Microsoft implemented encryption in SQL Server, how you can leverage this capability to achieve better security and compliance, and the critical issues involved in getting encryption right with SQL Server. I hope you will find this series helpful as you negotiate your SQL Server encryption projects.

Architecture
Many Microsoft applications and services implement a “Provider” interface. This is the term that Microsoft uses to describe a standardized, pluggable architecture that lets third party software companies integrate with and extend the capabilities of Microsoft solutions. With Provider architectures Microsoft enables a method for third parties to register their software with the Microsoft application, and the Microsoft application will then call that software as needed. The third party software must obey rules about the data interface and behavior of their applications. Done correctly, the Provider interface provides powerful extensions to Microsoft applications.

Starting with SQL Server 2008 the database implements a Provider interface for encryption and key management. This is named the “Extensible Key Management” Provider interface, or the “EKM Provider”. EKM Provider software performs encryption and key management tasks as an extension to the SQL Server database. The EKM Provider architecture opened the door for third party key management vendors to extend encryption to include proper encryption key management.

From a high level point of view the EKM architecture looks like this:

[Diagram: SQL Server EKM Provider architecture]

Every version of SQL Server since 2008 has fully implemented the EKM Provider architecture. This has provided a stable and predictable interface for Microsoft customers and encryption key management vendors.

EKM Architecture - Column and Database Encryption
The EKM Provider architecture supports two different methods of database encryption:

  • Cell Level Encryption
  • Transparent Database Encryption

Cell level encryption is also known as column level encryption. As its name implies it encrypts data in a column in a table. When a new row is inserted into a table, or when a column in a row is updated, the SQL Server database calls the EKM Provider software to perform encryption. When a column is retrieved from the database through a SQL SELECT or other statement the EKM Provider software is called to perform decryption. The EKM Provider software is responsible for both encryption and key management activity. Implementing cell level encryption requires minor changes to the SQL column definition.
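
Because the ciphertext produced by cell level encryption is binary, the protected column is declared as varbinary rather than a character type. A hypothetical sketch:

CREATE TABLE Orders (
    Customer_Number int,
    Customer_Name varchar(60),
    Credit_Card varbinary(160)  -- holds encryptbykey output, not the card number itself
);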

Transparent Database Encryption, or TDE, provides encryption for the entire database and associated log files. All tables and views in the database are fully encrypted. Data is encrypted and decrypted as information is inserted, updated, and retrieved by users and applications. As its name implies, transparent data encryption requires no changes to applications, SQL definitions, or queries. The database works seamlessly after encryption is enabled.

Transparent Data Encryption is the easier of the two encryption methods to implement. In a following part of this series I will discuss when it makes sense to use TDE and when Cell Level Encryption is a better choice.

Activating the EKM Provider
After installing the EKM Provider software from a third party, the SQL Server database administrator uses the SQL Server management console to activate the EKM Provider and place the database or columns under encryption control. The activation of the EKM Provider software causes the database to be immediately encrypted and all further data operations on the database will invoke the EKM Provider software.
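
For reference, a minimal sketch of that activation in T-SQL. The configuration options are standard SQL Server; the provider name and DLL path are illustrative and come from your vendor:

-- Allow EKM providers on this instance:
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'EKM provider enabled', 1;
RECONFIGURE;

-- Register the vendor's EKM Provider library (hypothetical path):
CREATE CRYPTOGRAPHIC PROVIDER KeyConnection FROM FILE = 'C:\ekm\KeyConnection.dll';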

Microsoft EKM Provider for Locally Stored Encryption Keys
Recognizing that some SQL Server customers wanted to encrypt data but did not have the resources or time to implement a key management solution, Microsoft provided a built-in EKM Provider that performs encryption but which stores encryption keys locally in the SQL Server context. Understanding that this was not a security best practice, Microsoft recommends that customers use a proper encryption key management solution that separates encryption keys from the SQL Server database. That was good advice - locally stored encryption keys can be recovered by cyber criminals and the use of external key management systems provides better security and compliance.

EKM Provider Software
EKM Provider software is usually provided by your encryption key management vendor. This means that the features and functions of the EKM Provider software can vary a great deal from one vendor to another. Be sure that you fully understand the architecture and capabilities of the EKM Provider before you deploy SQL Server encryption.

SQL Server Versions That Support EKM
EKM Provider support is available in all Enterprise editions of SQL Server including Data Warehouse and Business Intelligence editions. EKM provider support is not available in Standard, Web, or Express editions of SQL Server.

In the parts of this series that follow I will go into more detail on the EKM Provider interface, transparent data encryption, cell level encryption, business continuity, compliance, and other topics.

Patrick


Topics: Encryption, SQL Server

Three Core Concepts from "Zero Trust" to Implement Today

Posted by Ken Mafli on Feb 1, 2017 12:57:58 PM


“There are only two types of data that exist in your organization: data that someone wants to steal and everything else.”

Forrester Research

In 2013, Forrester released an outline of their proprietary “Zero Trust Model” of information security to The National Institute of Standards and Technology (NIST).  Their model seeks to change “the way that organizations think about cybersecurity,” execute on higher levels of data security, and all the while “allowing for free interactions internally.”

But, when looking to better secure your organization’s data security posture, it is good to start with what has changed.  In the report, Forrester concluded that the old network security model was that of “an M&M, with a hard crunchy outside and a soft chewy center.”  It is the idea of the hardened perimeter around the traditional, trusted datacenter.  This old model is fraught with vulnerabilities as the traditional model is not equipped to handle new attack vectors with IoT, workforce mobility, and data centers moving to the cloud. It is increasingly becoming outmoded and weak.

In its place must come a data security model that takes into account the current network landscape and its vulnerabilities.  Enter Zero Trust.  It builds upon the notion of network segmentation and offers key updates, all under the banner: "never trust, always verify."

Below are the three main concepts to Zero Trust.  Follow along as we break down the trusted/untrusted network model and in its place rebuild a new trust model.


Assume All Traffic is a Threat

The first rule of “never trust, always verify” is that all traffic within the network should be considered a potential threat until you have verified “that the traffic is authorized … and secured.” Let’s look at these two components:

  • Authorized Traffic: Each end user should present valid (and up-to-date) login credentials (i.e. username and password) as well as authenticate themselves with multi factor authentication for each session logging into the network.  Usernames and passwords are not enough.  Only multi-factor authentication can reduce the risk of a hacker obtaining and misusing stolen login credentials.
  • Secured Traffic: All communication, coming from inside and outside of the network, should be encrypted.  It should always be assumed that someone is listening in.  Using SSH or TLS and keeping abreast of their potential vulnerabilities is the only way to reduce the risk of exposure.


Give Minimal Privileges

The only way to minimize the risk of employees, contractors, or external bad actors misusing data is to limit the access each user/role is given to the least amount of privileges possible.  With this, it is a foregone conclusion that all sensitive data is already encrypted and minimal privileges are given as to who can decrypt it.  We implement a minimal privileges policy so that “by default we help eliminate the human temptation for people to access restricted resources,” and so that a hacker who obtains a user’s login credentials does not thereby gain access to the entire network.

The role-based access control (RBAC) model, first formalized by David Ferraiolo and Richard Kuhn in 1992 and then updated under a more unified approach by Ravi Sandhu, David Ferraiolo, and Richard Kuhn in 2000, is the standard today.  Its ability to restrict system access to authorized roles/users makes it the ideal candidate for implementing this leg of Zero Trust.  While Zero Trust does not explicitly endorse RBAC, it is the best game in town as of today.  For a deeper dive, visit NIST’s PDF of the model.
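
To make least privilege concrete in the SQL Server terms used elsewhere on this blog, a role-based grant might look like the following sketch; the role, user, table, and column names are hypothetical:

-- A role that can read claims data but never the SSN column:
CREATE ROLE claims_reader;
GRANT SELECT ON dbo.Claims TO claims_reader;
DENY SELECT ON dbo.Claims (SSN) TO claims_reader;

-- Users receive the role rather than direct table privileges:
ALTER ROLE claims_reader ADD MEMBER app_user;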


Verify People are Doing the Right Thing

Once we have authenticated each user and restricted them to the least amount of data possible to adequately do their job, the last thing to do is “verify that they are doing the right thing” through logging and inspection.

Here is a short (and certainly not exhaustive) list of techniques used to inspect all events happening in your network.  

  • Real Time Event Collection: The first step is to collect and log all events, in real time.
  • Event Correlation: Next, you need to analyze all of the events and narrow in on the ones that need greater scrutiny.
  • Anomaly Detection: In a related move, you will want to identify the events that do not conform to the expected pattern and investigate further.
  • Resolution Management: All events that do not meet the expected pattern should be investigated and either classified as benign or deemed a possible threat and escalated for further investigation.

Note: There are many tools available that accomplish these.  Please refer to Gartner’s Security Information Event Management (SIEM) Magic Quadrant to find the tools that may interest you.


Final Thoughts

It's not a question of if, but when, a data breach will happen. Hackers grow more sophisticated in their attacks and threaten everything from intellectual property to financial information to your customers’ Personally Identifiable Information (PII).  The old model of the high, guarded perimeter with the trusted, internal network no longer functions as a secure model.  Zero Trust offers a more comprehensive approach to today’s data security needs.  As you look to deploy this model, begin to seek out tools that will help you.  Here is a short list of some of the tools to consider:

  • Log Collection Tools: Some platforms, like the IBM i, have proprietary formats that are difficult for SIEMs to read.  Make sure your SIEM can fully collect all needed logs.  If it cannot, find or make a tool that will properly capture and send the logs on to your SIEM.
  • SIEM Tools: As mentioned earlier in the article, there are many good SIEM tools out there to help you collect, analyze, and monitor all events on your network.
  • Encryption (data-in-flight): Fortunately, there are many open source protocols for secure communications, like SSH and TLS.
  • Encryption (data-at-rest): Advanced Encryption Standard (AES) encryption is ubiquitous in most platforms’ native encryption libraries.  There are also a number of products that offer everything from column level to folder/file level encryption.
  • Centralized Key Management: The encryption you deploy is only as good as the level of protection you give to the encryption keys.  Therefore, robust encryption key management is a must.
  • User Access Management: Managing privileges, credentials, and multi factor authentication can be a daunting task.  The more you can automate this, the better.

In many cases, adopting this approach will not be about bolting a few products onto your existing data security framework but about completely renovating it.  Don’t let expediency force you to defend your data with only half measures.  Take a deep dive into Zero Trust’s approach and see where you may be vulnerable.



Topics: Data Security

The Future of Active Security Monitoring on the IBM i

Posted by Luke Probasco on Jan 24, 2017 8:19:21 AM

Active monitoring is one of the most effective security controls an enterprise can deploy. In fact, a large majority of security breaches occur on systems that have been compromised days, weeks, or even months before sensitive data is lost. A recent Verizon Data Breach Investigations Report indicates that a full 84 percent of all breaches were detected in system logs.  By actively collecting security logs in real-time, organizations can not only monitor security events, but also prevent a data breach before it starts.  I recently sat down with Patrick Townsend, to discuss log collection and active monitoring on the IBM i.

Hi Patrick, can you give our readers an overview on the importance of collecting and monitoring security logs on the IBM i?

One of the most effective things that you can do to prevent a data breach is to deploy an active monitoring solution, sometimes also known as system logging.  You’ll find active monitoring at the top of all cyber-security lists of things to do – because it is effective.  Active monitoring is key to a strong security posture, for anybody.

Today, we all know that there is no longer a true perimeter and that our systems are at risk.  Luckily, active monitoring can help.  Here are some key principles that organizations need to understand.  First, an active monitoring solution needs to involve a log collection server or SIEM solution (IBM Security QRadar, Splunk, LogRhythm, etc.) to collect security events across the entire enterprise and actively detect threats.  Second, there needs to be real-time collection and monitoring of security events.  Rather than scooping up the security events once or twice a day, it is imperative to collect these events in real time. When you collect logs across the entire enterprise, a SIEM can provide a lot of intelligence to identify patterns and anomalies – which will identify a potential attack.  The final critical components are good reporting, query, and forensics tools.  SIEM solutions also give you the ability to quickly run reports and analyze suspect data.  This is important for two reasons.  If you are having an attack you need to identify quickly where the attack is originating and how it is happening.  This is essential in order to know how to remediate it.  And if you aren’t able to pinpoint the problem, it is very likely that you are going to be attacked by the same methods again.

Switching gears, the serious points for an IBM i customer revolve around the fact that the IBM i is a critical back-office processor for most customers and runs multiple applications.  Too often the IBM i is an island within an organization, but it is important that it is fully integrated in your enterprise’s entire infrastructure security strategy.

Also, it is generally true that a cyber-attack almost never starts on an IBM i server.  Attacks typically start on a compromised user PC or someplace else in the organization.  From there, a hacker spends a fair amount of time probing around the IBM i, finding any weak points.  We shouldn’t be naïve – hackers know about IBM i servers.  They know what to look for, they know the user IDs, they know how to compromise these systems – they are very good at it.

IBM introduced some new security event sources in V7R3.  Can you talk a bit about those? And what events should an IBM i customer be collecting?

Every release of the IBM i server has had new security events and fields to collect and monitor.  At Townsend Security we work very hard to stay ahead of these releases so that our customers are well positioned to handle new information and use it for protection.  A couple examples include IPV6 address support and new fields in existing events.  Regarding the recent V7R3 release, new sources include:

  • QAUDLVL (Auditing level) system value
  • *NETSECURE (to audit secure network connections)
  • *NETTELSVR (to audit Telnet connections)
  • *NETUDP (to audit UDP connections)

To address the second part of your question, when you deploy an active monitoring solution on the IBM i, you are certainly going to want to collect events from QAUDJRN, QHST, QSYSOPR, as well as exit points.  Interestingly, the QAUDJRN security audit journal does not exist when you first install a new IBM i server. You must create the journal receivers and the journal to start the process of security event collection.

Aside from the new log sources that IBM introduced in V7R3, for someone who maybe deployed a logging solution a few years ago, what should they be aware of now?

First, let’s take a look at how compliance regulations have been evolving.  We now know that most attacks work on the basis of privilege escalation.  For example, an attacker gets access to our systems and then eventually gets sufficient authority to steal data. Because of this, we are seeing that it is more important to identify when an administrative level or highly privileged user logs in to our system.  This is an example of how a logging solution needs to evolve to meet current compliance requirements. Businesses are now required to log and monitor that activity.

Unfortunately, this can be particularly hard on the IBM i.  On first look, an IBM i account may appear to have normal user privileges, but may in fact inherit higher privileges through a Group Profile or Supplemental Group Profile. It is important to detect these elevated privileges in real time and provide the security administrator with an easy-to-use report to identify the source of elevated privileges. This is an excellent example of how logging solutions need to evolve with the ways security events are monitored.  We recently tackled this in the latest release of our Alliance LogAgent.

Where do you see the future of logging on the IBM i?

Let me dust off my crystal ball!  First off, File Integrity Monitoring (FIM) will become more important.  To maintain a strong posture, security administrators need to know who is accessing sensitive data and system values on the IBM i.  We’re also going to see more requirements around File Integrity Monitoring across the regulatory compliance environments.  Why?  Because, as we discussed earlier, cyber-attackers escalate privileges, access sensitive data, and change security configurations in order to get the work done that they want to do.  Again, this is why we are seeing increased requirements in regulations like the Payment Card Industry Data Security Standard (PCI DSS) and new financial services regulations.

Another interesting prediction:  It won’t be unheard of for organizations to use multiple SIEM solutions. We are starting to see businesses use one SIEM for traditional security monitoring and another to monitor operational data.  Operational data, you ask?  Sure.  Logging solutions can easily allow administrators to answer operational questions like: How full are my disks?  Do I have any critical hardware errors?  Second, they can benefit from deploying a SIEM to monitor application data.  Sales teams, for example, can track inventory status, trending products, etc.  The benefits of file monitoring don’t have to be exclusive to security.

In the near future, we will also see a pickup of integration with Artificial Intelligence (AI), also commonly referred to as cognitive computing.  IBM has the Watson platform, and there are others, which I believe will be used to enhance security.  We are already seeing initial efforts in this respect.  Harnessing that AI capability with security makes total sense.  

Finally, as we are seeing, everything not bolted down is going to the cloud.  We will definitely see an evolution of new cloud services around security and logging.  It may take a little time for vendors to start leveraging that, but I believe it is definitely in the works.

To hear this interview in its entirety, download our podcast “The Future of Security Logging on the IBM i” and hear Patrick Townsend, founder and CEO of Townsend Security, further discuss log collection and monitoring on the IBM i, new log sources in V7R3, and the future of security logging on the IBM i.


Topics: System Logging, Alliance LogAgent

Fixing the TDE Key Management Problem in Microsoft SQL Server

Posted by Patrick Townsend on Jan 10, 2017 7:31:56 AM

Many Microsoft SQL Server users have taken the first step to protect sensitive data such as Personally Identifiable Information (PII), Protected Health Information (PHI), Primary Account Numbers (PAN) and Non-Public Information (NPI) by encrypting their databases with Transparent Data Encryption (TDE). It is extremely easy to implement TDE encryption as it does not require program changes.

A common cause of audit failures might not be so obvious: the failure to properly protect the SQL Server key encryption key once you activate encryption in SQL Server. With Transparent Data Encryption you have the choice of storing the service master key within the SQL Server context itself, or protecting the master key with a key management system using the SQL Server Extensible Key Management (EKM) interface. Why is it important to do this?

It turns out that it is easy for cyber criminals to recover the SQL Server master key when it is stored within SQL Server itself. (Examples: https://blog.netspi.com/decrypting-mssql-credential-passwords/ and http://simonmcauliffe.com/technology/tde/#hardware)

Simon McAuliffe provides the clearest explanation I’ve seen of the insecurity of locally stored TDE keys in SQL Server. I don’t agree with him on the question of whether a key manager improves security. Given that there is no perfect security, I believe that you can get significant security advantages through a properly implemented key management interface.

If your TDE keys are stored locally, don’t panic. It turns out to be very easy to migrate to a key management solution. Assuming you’ve installed our SQL Server EKM Provider, called Key Connection, on your SQL Server instance, here are the steps to migrate your Service Master Key to key management protection using our Alliance Key Manager solution. You don’t even need to bring down SQL Server to do this (from the Alliance Key Manager Key Connection manual):

Protecting an existing TDE key with Alliance Key Manager

First, create a new asymmetric key pair within the AKM Administrative Console using the “Create EKM Key” and “Enable Key for EKM” commands.

Then return to SQL Server and call the following command to create the asymmetric key alias for the new KEK that you created on the AKM server:

use master;

create asymmetric key my_new_kek from provider KeyConnection with provider_key_name = 'NEW_TDE_KEK', creation_disposition = open_existing;

In this example, NEW_TDE_KEK is the name of the new key on AKM, and my_new_kek is the key alias.

Then use the ALTER DATABASE statement to re-encrypt the DEK with the new KEK alias assigned in the previous statement:

ALTER DATABASE ENCRYPTION KEY
ENCRYPTION BY SERVER ASYMMETRIC KEY my_new_kek;

Note that you do not have to take the database offline to perform this action.
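
After the re-encryption, you can confirm that the database stayed encrypted throughout by querying a standard dynamic management view (an encryption_state of 3 means encrypted):

SELECT db_name(database_id) AS database_name, encryption_state FROM sys.dm_database_encryption_keys;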

Of course, there are other steps that you should take to secure your environment, but I wanted to demonstrate how easy it is to make the change.

The SQL Server DBA and the network administrator will have lots of other considerations in relation to SQL Server encryption. This includes support for clustering and high availability, automatic failover to secondary key servers, adequate support for separation of duties (SOD) and compliance, and the security of the credentials needed to validate SQL Server to the key manager. All of these concerns need to be addressed in a key management deployment.

For SQL Server users who deploy within a VMware or cloud infrastructure (AWS, Azure), Alliance Key Manager can run natively in your environment, too. It does not require a hardware security module (HSM) to achieve good key management with SQL Server. You have lots of choices in how you deploy your key management solution.

It turns out not to be difficult at all to address your SQL Server encryption key insecurities!

Patrick


Topics: SQL Server, Transparent Data Encryption (TDE)

