Do we control technology, or does technology control us? Consider a scenario where you are driving home from work, and Google Maps diverts you from your regular route onto a circuitous journey that is apparently faster. Given the accuracy of Google Maps, the decision hardly rests with the driver. That predictive certainty is only improving, further removing human decision-making from the equation. Delegating relatively trivial decisions, such as driving directions, to computer systems may not draw concern. However, near-certain predictions in areas such as crime and death can significantly shape our lives without an ounce of human decision-making.
At the core of these technologies is data. The more specific the data is to the subject of a prediction, the more accurate that prediction becomes; the use of personal data thus moves predictions from aggregate trends toward certainty about an individual. Businesses continue to make significant investments in data processing that enables data-driven decisions in real time.
This Note does not dive deep into the specifics of the numerous data privacy laws, nor should it stand as a warning of a technological doomsday. Rather, it considers the core principles behind data privacy regulations, examines how technological advancements shift toward a hyper-personalized experience that runs counter to these principles, and proposes a regulatory method that may achieve these principles.
This Note begins by providing the backdrop of data privacy. The Note discusses traditional privacy concepts that have recently evolved to tackle the data privacy issues of autonomy and control in the digital era. Next, the Note classifies data based on a Johari window model to make sense of the broad groups of data and the relationships between predictive systems and the individual. Building on that classification, Part IV shows how many data privacy regulations attempt to alter power dynamics by shifting control of data and decision-making. However, for individuals to achieve the highest level of autonomy and control, regulations must ultimately address the results of the data and the information asymmetries associated with our blind spots. In conclusion, the Note proposes a tax on hyper-personalization as a way for consumers to take control, not necessarily of their data, but of the decisions in their lives.