Privacy of Personally Identifiable Information under State Law
Just as legislatures should stay away from writing technical data security specifications, regulatory authorities should shy away too. An example of an unhelpful technical regulation comes from the well-meaning Massachusetts Office of Consumer Affairs and Business Regulation. It published regulations on the protection of personal information, 201 CMR 17.00. Section 17.04(5) requires "encryption" of personally identifiable data on laptops and iPads.
But 17.02 defines "encrypted" as "the transformation of data through the use of an algorithmic process, or an alternative method at least as secure, into a form in which meaning cannot be assigned without the use of a confidential process or key . . . " Hmm. So under this regulation, what does the word "encryption" mean in practice?
"Encryption" seems to include any transformation of data that is at least as secure as an algorithmic process. But which algorithm? The regulation does not say. Some algorithms are trivial to break; others are much harder. Few if any commercially useful algorithms are impossible to break.
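To see why "an algorithmic process with a key" cannot by itself be the test, consider a Caesar cipher. The sketch below is purely illustrative (the regulation names no algorithms, and these function names are mine): the cipher is an algorithmic process with a confidential key, yet an attacker recovers the plaintext in at most 26 guesses.

```python
def caesar_encrypt(plaintext: str, key: int) -> str:
    """Shift each lowercase letter by `key` positions -- an 'algorithmic
    process' using a 'confidential key,' yet trivially breakable."""
    return "".join(
        chr((ord(c) - ord("a") + key) % 26 + ord("a")) if c.islower() else c
        for c in plaintext
    )

def brute_force(ciphertext: str) -> list[str]:
    """Only 26 possible keys exist, so an attacker simply tries them all."""
    return [caesar_encrypt(ciphertext, -k) for k in range(26)]

ciphertext = caesar_encrypt("social security number", 3)
# The true plaintext appears among the 26 candidates -- no key required.
assert "social security number" in brute_force(ciphertext)
```

So a literal reading that accepts any keyed algorithm would bless schemes like this one, which plainly do not keep "meaning" from being assigned.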
Would it be reasonable to interpret 17.02 to allow processes that are easy to break? Maybe not. 17.02 requires the process to transform data "into a form in which meaning cannot be assigned without the use of a confidential process or key." An easily breakable algorithm does not satisfy the "cannot" requirement.
OK. So it seems 17.02 excludes an easily breakable algorithm. Next question: What about an algorithm that is hard to break, but not literally impossible to break? Many reasonably good algorithms can eventually be broken if (for example) enough brute-force computing power is applied over a long period of time. If 17.02's word "cannot" is read literally, then the hard-to-break algorithm would be excluded too. But such a literal reading of the regulation would be unreasonable, because few if any commercially available algorithms are literally impossible to break forever.
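Back-of-the-envelope arithmetic makes the "hard but not impossible" point concrete. Assuming a hypothetical attacker who can test a trillion keys per second (the rate is an assumption for illustration; real figures vary enormously with hardware), exhausting a short key space takes hours, while exhausting a modern 128-bit key space takes longer than any practical horizon:

```python
# Hypothetical attacker speed -- an assumption for illustration only.
GUESSES_PER_SECOND = 10**12
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_exhaust(key_bits: int) -> float:
    """Years needed to try every key of the given length at the assumed rate."""
    return (2 ** key_bits) / GUESSES_PER_SECOND / SECONDS_PER_YEAR

print(f"56-bit key:  {years_to_exhaust(56):.4f} years")   # well under a day
print(f"128-bit key: {years_to_exhaust(128):.2e} years")  # astronomically long
```

Every key, in principle, can be found by exhaustion, so no cipher is "impossible" to break; but a sufficiently large key space makes breaking it impractical, which is presumably what the drafters were after.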
Hence, it seems 17.02 requires hard-to-break encryption – but not impossible-to-break encryption – anytime private data are stored on a laptop. [If in fact that is what the drafters of the regulation meant, then why didn't they explicitly say so?]
Now that we better understand the regulation, let's consider the technology of encryption itself. Smart people are constantly seeking spectacular new ways to break good encryption, and every so often they succeed. For example, Wired Equivalent Privacy (WEP) encryption was broken a few years after it came into wide use.
Given that strong encryption is proven from time to time to be weak, encryption users have to upgrade their technology every so often. When they hear that their current encryption has been broken, they shift to something else. Section 17.02 could reasonably be read to require this upgrading process.
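The upgrade process 17.02 arguably requires can be sketched as a migration loop: decrypt each record with the broken method, re-encrypt with its replacement. The two "methods" below are toy stand-ins (byte-wise XOR and a byte shift), not real ciphers, and all names are mine; the structure of the migration, not the cryptography, is the point.

```python
def method_x_encrypt(data: bytes, key: int) -> bytes:
    """Toy 'method X' (now reported broken): XOR each byte with the key."""
    return bytes(b ^ key for b in data)

def method_x_decrypt(data: bytes, key: int) -> bytes:
    return bytes(b ^ key for b in data)  # XOR is its own inverse

def method_y_encrypt(data: bytes, key: int) -> bytes:
    """Toy replacement 'method Y': shift each byte by the key."""
    return bytes((b + key) % 256 for b in data)

def method_y_decrypt(data: bytes, key: int) -> bytes:
    return bytes((b - key) % 256 for b in data)

def upgrade(record: bytes, old_key: int, new_key: int) -> bytes:
    """Migrate one stored record from the broken method to its replacement."""
    plaintext = method_x_decrypt(record, old_key)
    return method_y_encrypt(plaintext, new_key)
```

Note what the migration requires: the data holder must locate every encrypted copy, have the old key on hand, and re-process each record – which is exactly the burden at issue in the warehouse scenario below.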
Assuming 17.02 does require periodic upgrading, please consider this scenario: A Massachusetts government agency stores private data on numerous laptops. To comply with 201 CMR 17.00, the agency implements encryption method X to protect the data. At the time of implementation, method X has a reputation for being good.
As time passes, a lawsuit arises, and the data on the laptops might be relevant to the lawsuit. The agency therefore implements a litigation hold on the data on the laptops, so as to avoid destroying any evidence while the lawsuit is pending.
A lawsuit can take years to conclude. During the pendency of this lawsuit, let's say the agency de-commissions the laptops. Its employees no longer use the laptops. But the agency cannot destroy the data on the laptops on account of the litigation hold. So it stores the laptops in a well-secured warehouse.
More time passes. It becomes widely known in the encryption community that method X is lousy (like WEP); it is breakable. Must the agency now go to the expense of upgrading the encryption on the de-commissioned, physically secure laptops? Massachusetts regulation 201 CMR 17.00 seems to require such a (senseless) upgrade, for the regulation is inflexible: it gives users no discretion.
Lesson: Better-written laws just set goals, and let users apply all the methods at their disposal to reach those goals. Unlike 201 CMR 17.00, better laws avoid specifying particular technologies for advancing civil rights like privacy.
Update: Reacting to public criticism, Massachusetts has revised proposed 201 CMR 17.00 many times since first publication. Last I heard, the effective date of the latest version of the regulation is March 1, 2010.
Mr. Wright is an advisor to Messaging Architects, a thought leader in data records management.