“It seems that the great doctrine of Development rules not only in biology and theology, but in the law as well; so that whenever, in the long process of civilization, man generates a capacity for being made miserable by his fellows in some new way, the law, after a decent interval, steps in to protect him.” (Atlantic Monthly, 1891)
We hear stories and controversies about “privacy” on a daily basis, yet we may find it difficult to define. Privacy has joined that space of broad concepts such as “liberty” or “freedom”: ideas whose meaning we understand in a general sense even if we do not have precise words to capture them fully.
Part of that struggle comes from privacy evolving as an idea to encompass new concerns stemming from new technologies. We can easily understand why someone reading over our shoulder might be a violation of privacy, but when you browse the Web, are there equivalent actions to “reading over your shoulder,” and if so, how do we even know when someone is engaging in such violations? Yet as new tools create new opportunities to collect and process data at scale, we still need a way to describe the discomforts that arise from those changes in information flows.
In her framework of contextual integrity, Dr. Helen Nissenbaum (professor of information science at Cornell Tech) recognized that when information gets passed around or used to provide services, it implicitly carries with it certain norms for where it should go and how it should be used. Often, when someone calls a technology “creepy,” they are instinctively identifying a violation of those norms.
For example, if you meet a friend for lunch at a restaurant and discuss events in your life, you might not share highly sensitive information given the publicly accessible environment. However, you likely still maintain some expectations around your privacy (perhaps in an ethical sense if not a legal one); even though people nearby can potentially overhear your conversation, you would likely be surprised if someone transcribed everything you said. You intended to share your words only with your friend, even if you accepted the trade-offs of a semi-public space for the sake of convenience.
Nissenbaum’s framework helps explain why we now use the term “privacy” for more than simply “keeping something private” (much as “liberty” and “liberal” share a root yet carry different frames of reference). We often exercise more granular control over what we share and with whom. From a legal and philosophical standpoint, Professor Adam D. Moore once broadly defined a right to privacy as “a right to control access to and uses of—places, bodies, and personal information.”
But as others have noted, the impacts of misusing data often come at a collective level, and nowadays it can be difficult for individuals to evaluate all the consequences of a given data flow when making decisions about access. Over time, many have recognized that the organizations that process someone’s information carry a responsibility to act in that person’s best interest; technology has changed how we gather, share, and process information in so many ways that relying on every individual involved to ensure appropriate information flows across a system has become untenable.
Such observations help form what I have found to be a useful working definition of data privacy in a product context: Privacy is respecting people’s expectations, agency, and interests when handling information about them.
You can unpack quite a bit from that one sentence, which is what I hope to do with this newsletter. Through news stories and case studies, I want to share more about how this plays out in practice, especially from a product/UX perspective. I often see privacy treated as a legal/compliance issue or a subset of security, but here I hope to illustrate a broader perspective on how privacy can contribute to more innovative and useful products.