Since the early days of data privacy, the principle of purpose limitation has been a core component of practice. It is, after all, one of the original fair information practice principles. As an enforcement hook, however, purpose limitation has generally been too amorphous to lead to meaningful changes in data processing.
Way back in 1980, the Organisation for Economic Co-operation and Development's guidelines beseeched us to limit data processing to the purposes for which the data was collected — plus others "not incompatible with those purposes."
But compatibility, without further refinement, is a truck-sized hole in the principle of purpose limitation.
This is especially true if purposes are not narrowly determined. If an organization's purpose of processing data is to generally deliver the service of an online web application, the compatible purposes are limited only by the imaginations of its lawyers.
Though it did not close this hole, the EU General Data Protection Regulation certainly added an obstacle course around which the truck must navigate to deliver new purposes of processing. Before engaging in a new type of processing not based on consent, GDPR Article 6(4) outlines a five-part test for compatibility that we should "take into account." It is an open-ended list, marked with the roguish phrase "inter alia," inviting us to also take into account any number of other unlisted factors besides these five:
- A link between the new and original purpose.
- The context of collection, "in particular regarding the relationship between data subjects and the controller."
- The nature and sensitivity of the personal data.
- The possible consequences of the new processing for data subjects.
- Appropriate technical safeguards.
Though ostensibly a guiding principle of U.S. data privacy too, purpose limitation has historically been refracted through the lens of consumer protection law.
In practice, what was visible on the other side was a deception analysis, focused primarily on whether new purposes of processing are consistent with the reasonable expectations of consumers, which are, in turn, based primarily on the policies and disclosures put forward by the organization.
As U.S. state consumer privacy legislation was layered on, this picture changed only slightly, with limits still anchored to what a company has disclosed. California focuses on the reasonable expectations of consumers and the proportionality of processing to the disclosed purpose. In general, other states limit processing to what is adequate, relevant or reasonably necessary for a disclosed purpose.
For personal data, no state legislation restricts processing in ways that cannot be overruled by a creative privacy notice. And for sensitive personal data, no state has created restrictions that cannot be overruled by the operation of consent.
All of this may soon change.
Currently awaiting the governor's signature in the state of Maryland — incidentally, the very same D.C.-adjacent state in which your humble columnist resides — is the Maryland Online Data Privacy Act of 2024.
All the usual state privacy experts — Husch Blackwell's David Stauss, CIPP/E, CIPP/US, CIPT, FIP, PLS, Future of Privacy Forum's Jordan Francis, CIPP/US, and the IAPP's Joe Duball among them — have already flagged the innovative nature of this bill. I'll join the fray here to say once again that Maryland could be the first jurisdiction to embrace strong purpose limitations.
Data minimization, under the Maryland framework, entirely prohibits some processing of certain types of data, regardless of creative disclosures or even consent. For other data, it appears to impose a strong consent framework, with vanishingly few alternative legal bases for processing.
At the collection stage of the data life cycle, the bill makes use of a subtle but potentially powerful change in framing.
Rather than focus on the disclosures made by a company, and a consumer's reasonable interpretation thereof, the Maryland bill focuses on "the specific product or service requested by the consumer." The collection of personal data must be "reasonably necessary and proportionate" to provide or maintain the requested product or service. Full stop. No compatibility?
Placing the individual in the driver's seat in this way has not been done before. In what manner and through what mechanism does a consumer request a specific product or service? Likely through some form of consent. In any case, this is certainly a different question than whether a company has adequately disclosed purposes in a way that meets the legal requirement to set consumer expectations.
Unfortunately, ambiguities in the bill could limit its immediate impact.
For more general processing of nonsensitive data, Maryland deploys the more standard limitation language, prohibiting processing that is not "reasonably necessary to, nor compatible with, the disclosed purposes," unless overridden by consent. Does this secondary use provision allow for bare disclosure of additional purposes not related to the product or service requested by the consumer during collection? Possibly.
But Maryland doesn't stop there. It includes broad prohibitions on certain types of data processing. If enacted, these would be the first broadly applicable data minimization restrictions that may not be overridden by the operation of consent.
Maryland's bill prohibits the sale of sensitive personal data and the sale of any personal data of consumers under age 18. For the latter, it makes use of an aggressive knowledge standard. If a company "should have known" that a consumer is under 18, it is prohibited from selling their personal data.
But note the definition of "sale," which explicitly exempts any disclosure to a third party for purposes of providing a product or service affirmatively requested by a consumer. Would consent by any other name smell as sweet?
Let's examine one more provision.
For the collection and processing — other than the sale — of sensitive data, companies are limited to what is "strictly necessary" to provide or maintain a specific product or service requested by the consumer. This heightened standard will need to be tested in practice to see whether it holds up. The detailed analyses linked above point out other inconsistencies that could weaken the provision in practice.
Even if it does not deliver a fully loophole-proof data minimization standard, Maryland is poised to add to the state privacy patchwork in a meaningful way. Privacy pros will need to reexamine their assumptions to bring processing into compliance with this new framework.
The effort is not wasted. A similar approach to data minimization is on display in the discussion draft of the American Privacy Rights Act. At the federal level, the ideas being tested in Maryland are already enmeshed in bipartisan proposals.
Though the era of notice and choice may not be quite over, its end is drawing closer every day.
Please send feedback, updates and Old Bay seasoning to cobun@iapp.org.