
A software engineer's guide to encryption: How not to fail

Luther Martin, Distinguished Technologist, Micro Focus
 

Encryption is a tricky topic. It comes with its own set of incomprehensible abbreviations and acronyms, and understanding the details of how it works requires a few years of graduate-level mathematics.

There is, however, an easy way to select encryption technology that's suitable for most business uses, although it may not give you quite what you expect, and it is a good example of why being regulatory-compliant and being secure are not the same thing. Today, the most important criterion is to use encryption that has been validated against the US government's Security Requirements for Cryptographic Modules standard, better known as FIPS 140-2. Understanding why this makes sense requires a short digression into economics.

Lending credence to encryption

Economists divide goods into three types—search goods, experience goods, and credence goods—whose differences depend on how easy it is for consumers to verify their properties.

Search goods have properties that are easy to check before you consume them. If you are in the market for a red car, for example, it is easy to check if a potential purchase is really red. Very few, if any, information security products fall into this category.

Experience goods have properties that are not obvious before you consume them but that are easy to verify after you consume them. If you are looking for a car with a certain fuel efficiency, perhaps getting at least 35 miles per gallon under your typical driving conditions, you cannot tell this just by looking at the car itself (which is why laws mandate that this information be provided to consumers), but you can easily test it once you own it.

Many security products are probably experience goods. You cannot tell before you deploy it whether or not antivirus software or an intrusion detection system (IDS) will really protect your network, for example, but you can observe warning messages and review the logs of the products after they have been deployed to verify that they are actually working.

Credence goods have properties that cannot easily be checked, either before or after they are consumed. Organically grown produce and meat from animals raised in humane conditions are examples of credence goods; it is very difficult to verify these particular properties, even after you consume goods that have them. Many medicines are credence goods, because it is difficult to tell if your recovery was really due to the medication, a placebo effect, or even simply your body recovering on its own.

Products that implement encryption are probably credence goods. It takes expensive and uncommon skills to verify that data is really being protected by the use of encryption, and most people cannot easily distinguish between very weak and very strong encryption. Even after you use encryption, you are never quite sure that it is protecting you.

It is always possible that a clever adversary could develop an attack that defeats the encryption you are using and then quietly carry it out, perhaps reading your encrypted messages, while you have absolutely no idea that it is happening.
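To make this concrete, here is a minimal Python sketch of why encryption quality cannot be judged by inspection. It assumes the third-party cryptography package is installed, and the message and key handling are purely illustrative. Ciphertext produced by a strong, widely trusted algorithm (AES-256-GCM) and ciphertext produced by a trivially breakable single-byte XOR scheme both look like the same kind of opaque, random-looking bytes.

```python
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # third-party "cryptography" package

message = b"Patient record: Jane Doe, account 0000"

# Strong: AES-256-GCM, an algorithm approved for use under FIPS 140-2.
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
strong_ciphertext = AESGCM(key).encrypt(nonce, message, None)

# Weak: XOR with a single repeating byte -- breakable in milliseconds by brute force.
weak_key = os.urandom(1)[0]
weak_ciphertext = bytes(b ^ weak_key for b in message)

# Both outputs are just scrambled bytes; nothing about them reveals which one
# a determined adversary could read this afternoon.
print("strong:", strong_ciphertext.hex())
print("weak:  ", weak_ciphertext.hex())
```

Telling the two apart requires cryptanalysis, not observation, which is exactly the property that makes encryption a credence good.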

Products cannot always be classified as purely search goods, experience goods, or credence goods, and real products often have aspects of each category. Cars have some search characteristics, such as color, and some experience characteristics, such as fuel efficiency.

Similarly, information security products can have aspects of more than one category. You can easily review the product's logs to verify that a deployed IDS is stopping some attacks on your network, so it has some experience characteristics. At the same time, the trade-off you have to make when tuning an IDS between Type I errors (false alarms) and Type II errors (missed detections) means that a deployed IDS is probably also missing some attacks on your network, and you won't be aware of those.

The fact that the rate of missed attacks may be acceptably low, although you cannot actually verify it, also gives IDSs some credence characteristics.
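A toy calculation, using purely illustrative numbers rather than measurements from any real deployment, shows the shape of the problem: you can estimate how many attacks slip through, but you can never point to the specific events that were missed.

```python
# Illustrative assumptions -- not measurements from any real IDS deployment.
attacks_per_year = 1_000      # assumed attack attempts against the network
false_negative_rate = 0.02    # assumed Type II error rate after tuning

expected_missed = attacks_per_year * false_negative_rate
print(f"Expected undetected attacks per year: {expected_missed:.0f}")

# Roughly 20 attacks never appear in the logs, and nothing in the IDS output
# identifies them -- or confirms that the true miss rate really is 2%.
```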

On the other hand, cryptographic products seem to have many characteristics of credence goods and few characteristics of other types. You certainly cannot tell before you test it that such a product will operate as advertised, so there are probably no characteristics of search goods in these products. And because it is expensive and difficult to verify that the encryption provides strong protection to information or that a digital signature is really difficult to forge, even after it is used, cryptographic products show more characteristics of credence goods than of experience goods.

The uncertainty about quality associated with credence goods can lead to unusual results, such as prices that are lower than expected and fairly uniform, even in the face of significant quality differences.

If consumers of a product cannot easily distinguish between high- and low-quality goods, even after they have consumed the product, then you should expect that vendors cannot easily differentiate their products from competing ones. In this case, you should expect prices of competing products to be roughly the same.

Consumers will not be aware of the deficiencies in low-quality products, so producers of low-quality products will tend to overcharge for them. At the same time, competitive pressures will keep down the price of high-quality products. George Akerlof first described this situation in 1970 in his classic paper “The Market for ‘Lemons’: Quality Uncertainty and the Market Mechanism,” and he won the Nobel Prize in Economics in 2001 for his work in this area.

In the worst of these situations, the low-quality products will actually drive the high-quality products from the market, as vendors of the high-quality products refuse to sell their products at the low price that the market forces upon them.

FIPS 140-2: Ignore at your own risk

Standards such as FIPS 140-2 are designed to avoid such market failures and provide an indicator to customers that they are buying high-quality encryption.

And because it is also hard for auditors to tell if any particular form of encryption is good or not, many of them simply require that encryption follow the FIPS 140-2 standard. The HIPAA Breach Notification Rule, for example, explicitly gives safe harbor from data breach-reporting requirements only when healthcare information has been encrypted with FIPS 140-2-validated encryption.

Following FIPS 140-2 is also strongly recommended in other industries, such as the financial-services payments industry. The PCI Security Standards Council’s requirements for point-to-point encryption, as defined in “Payment Card Industry (PCI) Point-to-Point Encryption Solution Requirements and Testing Procedures,” explicitly allow the use of encryption technologies that have been FIPS 140-2-validated and require an additional certification process for those that have not.

In the federal government, complying with FIPS 140-2 is effectively mandatory. From the point of view of the government’s auditors (inspectors general), sensitive data that has been encrypted with a method that isn't FIPS 140-2-validated is treated as though it were not encrypted at all.

That is, a federal agency can be using encryption technology that is perfectly secure, but if that technology is not FIPS 140-2-validated, the encrypted data is treated as if it had no protection. This means it is possible for a federal agency to be using perfectly secure encryption and still have its auditors decide that the agency suffered a data breach by exposing sensitive information on the Internet.

This may sound a bit draconian, but the reason for this rigid stance should now be clear: Because encryption technology is a credence good, it is very hard to tell if it is effective or not, and the strict adherence to FIPS 140-2 is a way for the government to deal with this problem.    

The limits of FIPS 140-2

FIPS 140-2 is a fixed set of requirements against which encryption technology is validated. The requirements are intended to indicate that a particular implementation of encryption technology is secure. But does that mean other technologies are insecure? Not necessarily. Only a small number of encryption algorithms can be used under FIPS 140-2, and it takes the government many years to add a new algorithm to the approved list. So the fact that a particular encryption algorithm is not approved for use under FIPS 140-2 does not necessarily mean that it is insecure; it may simply mean that the algorithm has not yet made it through the approval process.
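One way this plays out in practice is that systems running in a FIPS mode simply refuse to use algorithms outside the approved set, regardless of how strong those algorithms are. The sketch below is hypothetical: the allow-list is an illustrative subset, not the official FIPS 140-2 approved-algorithm list, and ChaCha20 is used as an example of a well-regarded algorithm that is nonetheless not on it.

```python
# Illustrative subset of FIPS-approved algorithm names -- not the official list.
FIPS_APPROVED = {"AES", "SHA-256", "SHA-384", "SHA-512", "RSA", "ECDSA", "HMAC"}

def require_fips_approved(algorithm: str) -> None:
    """Reject algorithms that a FIPS-validated module would not expose."""
    if algorithm not in FIPS_APPROVED:
        raise ValueError(f"{algorithm} is not FIPS-approved (which is not the same as insecure)")

require_fips_approved("AES")       # accepted
require_fips_approved("ChaCha20")  # raises: widely trusted, but not FIPS-approved
```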

For at least the last decade, academic papers proposing new approaches to encryption have included rigorous mathematical proofs that those approaches are secure, and such proofs offer strong evidence that these new techniques truly are secure.

Before such proofs of security were used, the fact that no other cryptographer could find a weakness after several years of testing was the best evidence that an algorithm was probably secure. But this process is much more prone to error than the rigorous, mathematical approach. Because such proofs are still inaccessible to nonspecialists, however, it is not clear that they are useful for typical users of encryption. This leaves these users unable to accurately judge whether or not a new approach to encryption (a credence good) is secure.

Simply following an established standard like FIPS 140-2 is the practical solution to this problem.

It's the one criterion that matters most

There are many things to consider when selecting what type of encryption you should use and how to deploy it effectively to protect your sensitive data, but one criterion stands out as more important than all of the others: whether or not a given technology has been validated against the US government’s FIPS 140-2 standard. If you select encryption that meets this criterion, it will almost certainly be acceptable to your auditors, which is just as important as any other aspect of the technology.

