Data Leakage Law
The key to avoiding liability for data leakage is due diligence. As a broad generalization, the law of data security rewards those who are diligent, who exercise due care to prevent a compromise of privacy. Hence, a data holder that is diligent but still suffers a mishap is less likely to be held legally liable when a data subject decides to sue.
For example, in Guin v. Brazos Higher Education, the court held that a loan processor was not liable for a compromise of data security, in part because the processor had taken reasonable steps to protect the data (including a written security policy and a risk assessment).
Reasonable steps, or due diligence, can include applying the latest technology, such as filters that inspect outgoing data transmissions and scrub or block those that appear suspicious. Such an e-mail filter would have helped the Palm Beach County health department when an employee inadvertently broadcast a list of HIV/AIDS patients to 800 county employees.
When employing such filters, however, an issue is knowing what to filter. Obvious targets are Social Security Numbers and credit card numbers. But privacy is a context-specific abstraction. To understand which data are and are not private, an organization has to be sensitive to its own context. For example, a person's name plus postal address is normally not considered private; that information is commonly published in directories like telephone books. However, as one of my SANS Institute students taught me, a public housing authority is wise to consider the name plus postal address of its residents to be private. The reason is that some residents are sensitive about living in public housing, and consider their postal address (at a public housing location) to be private!
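To make the filtering idea concrete, here is a minimal sketch of what an outbound filter might check for. The regexes, the Luhn checksum pass (a standard way to weed out digit runs that are not plausible card numbers), and the context_terms parameter are all my illustrative assumptions, not any particular product's behavior; the point is that the "obvious" patterns are built in, while context-specific data (like a housing authority's resident addresses) must be supplied by the organization itself.

```python
import re

# Candidate patterns: US Social Security Numbers and 13-16 digit card numbers.
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number: str) -> bool:
    """Luhn checksum: filters out digit runs that cannot be card numbers."""
    digits = [int(d) for d in re.sub(r"\D", "", number)]
    if not 13 <= len(digits) <= 16:
        return False
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def scan_outgoing(text: str, context_terms=()) -> list:
    """Return reasons an outgoing message looks suspicious, or [] if clean.

    context_terms lets an organization flag data that is private only in
    its context, e.g. a housing authority's resident names and addresses.
    """
    findings = []
    if SSN_RE.search(text):
        findings.append("possible SSN")
    for m in CARD_RE.finditer(text):
        if luhn_valid(m.group()):
            findings.append("possible credit card number")
    for term in context_terms:
        if term.lower() in text.lower():
            findings.append("context-sensitive record: " + term)
    return findings
```

A real filter would do far more (attachments, encodings, thresholds for blocking versus merely logging), but even this sketch shows why the context list matters: without it, a resident's name and address would sail through as "public" data.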
Update: Netflix (the movie rental company) learned how context-specific PII can be. Netflix gave "anonymized" data about the viewing interests of its users to researchers so they could help develop better algorithms for suggesting new movies users would like to see. But academics have shown that almost any information about a person (such as movies watched) can be used to identify the person if it is matched with enough context; data on time, location and so on are good examples of context that can turn anonymized data into PII. So Netflix's research project drew a lawsuit saying the data it exposed to researchers was actually PII. The project also attracted attention from a government watchdog, the Federal Trade Commission. Netflix settled the lawsuit and ceased that particular line of research.
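The re-identification risk can be illustrated with a toy linkage attack. All records below are invented for illustration (this is not the actual Netflix data or the researchers' method): the "anonymized" table drops names, but the movie-plus-date pairs are distinctive enough to join against a public, named source such as posted movie reviews.

```python
# Invented "anonymized" viewing records: names removed, but context kept.
anonymized = [
    {"user_id": 101, "movie": "Movie A", "date": "2006-03-01"},
    {"user_id": 101, "movie": "Movie B", "date": "2006-03-05"},
    {"user_id": 202, "movie": "Movie C", "date": "2006-04-10"},
]

# Invented auxiliary data with names attached, e.g. public movie reviews.
public_reviews = [
    {"name": "Alice", "movie": "Movie A", "date": "2006-03-01"},
    {"name": "Alice", "movie": "Movie B", "date": "2006-03-05"},
]

def link(anon, public, threshold=2):
    """Map anonymized user_ids to names when enough records line up."""
    hits = {}
    for a in anon:
        for p in public:
            if (a["movie"], a["date"]) == (p["movie"], p["date"]):
                key = (a["user_id"], p["name"])
                hits[key] = hits.get(key, 0) + 1
    # Require several matching records before declaring an identification.
    return {uid: name for (uid, name), n in hits.items() if n >= threshold}
```

Two overlapping records suffice here to tie user 101 to "Alice"; user 202, with no overlap, stays anonymous. That is the lesson the lawsuit pressed: whether data is PII depends on what other data the world can match it against.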
--Benjamin Wright, computer security law instructor for SANS Institute (CLE and CPE).