Chapter 4 examines two efforts to reinforce consent: opt-in requirements and informed choice. It shows why, in the information economy, both fail. Power asymmetries enable systemic manipulation in the design of digital products and services. Manipulation by design thwarts improved consent provisions by interfering with people's decision-making: people's privacy choices are shaped by the designs of the systems with which they interact. European and American attempts to regulate manipulation, by switching tracking from 'opt-out' to 'opt-in' and by strengthening information requirements, founder on the illusion of consent. Contract law doctrines aimed at reducing manipulation are unsuitable because they assume mutually beneficial agreements, and privacy policies are no such thing. Even the best efforts to strengthen meaningful consent and choice, including policies specifically intended to protect users, ultimately prove insufficient because of the environment in which privacy "decisions" take place.
The State has been a mythological entity throughout its history, from its sovereign phase to its present dispersed, nodal, regulatory phase. This dispersal raises important questions about the gradual disappearance of public accountability. It also points to other key issues, such as the dilution of personal responsibility, especially when considered against the determinative implications of neuroscientific research. These trends are further amplified by the increasingly avaricious, non-consensual digitisation of the State and by the threat that practices such as data brokering and algorithmic friendliness pose to democratic values. The consequent move to a non-mythological State can be achieved by reimagining agencies as purpose-based bodies that operate on existential, fiduciary principles in a manner that avoids Pettit's republicanism. How this transition can take place is illustrated by the difference between mythological and non-mythological criminal justice, a model of which is presented.
Privacy has traditionally been conceptualized in an individualistic framing, often as a private good to be traded off against other goods. This chapter views the process of privacy enforcement through the lens of governance and the situated design of sociotechnical systems. It considers the challenges of formulating and designing privacy as a commons (per the Governing Knowledge Commons framework) when privacy is ultimately enacted, or not, in complex sociotechnical systems. It identifies six distinct research directions pertinent to the governance and formulation of privacy norms, spanning how the tools of design could be used to develop strategies for formulating and sustaining a privacy commons, and how specific technical formulations and approaches to privacy can serve the governance of such a commons.