Computers, Privacy & the Constitution

From Consent to Coercion: The Underlying Legal Failure Behind Pay-or-Consent Models

 
-- By LiuZihua - 24 Mar 2025 (11 May 2025 Second Revision)
 
A. Introduction

The rise of Pay-or-Consent models on digital platforms presents users with a misleading choice: surrender personal data for behavioral advertising or pay a fee for privacy. Though the model may seem to offer autonomy, it rests on the dangerous and deeply flawed assumption that privacy is a commodity to be traded rather than a fundamental right.
 
This paper argues that the Pay-or-Consent model is not just a flawed application of data protection law, but a symptom of a deeper structural failure: the mistaken reliance on consent as a legal basis for waiving fundamental privacy rights and limiting accountability. Privacy, as an inalienable and non-negotiable right, cannot be reduced to a market transaction or subject to individual bargaining.
 
B. Privacy Is Environmental, Not Transactional

International and domestic law has long affirmed privacy as a fundamental human right: Article 8 of the European Convention on Human Rights, Article 11 of the American Convention on Human Rights, and numerous human rights treaties worldwide. By definition, fundamental rights are inalienable: they cannot be traded, surrendered, or made conditional on a person’s economic position or bargaining power. This is why legal systems do not allow individuals to “consent” to the loss of basic protections such as access to safe drinking water or personal security.
 
Privacy, as a similarly fundamental right, must be understood not as a transactional or waivable choice but as an environmental condition—a baseline necessary for meaningful participation in public, social, and economic life. Treating it otherwise, by allowing user consent to shield data controllers from liability, undermines the very notion that such rights are inalienable and equally guaranteed. Ultimately, any model that relies on consent to legitimize the erosion of privacy is structurally incompatible with a legal and ethical system that claims to uphold rights as universal and non-negotiable.
 
C. The Two-Layer Failure of the GDPR’s Consent Framework

The legal framework most closely tied to the Pay-or-Consent model is the General Data Protection Regulation (GDPR), which was designed to protect personal data and to affirm privacy as a fundamental right. But its central regulatory mechanism, consent, rests on a misguided premise: that individuals can meaningfully authorize the use of their personal data, and that this authorization can shield data controllers from liability. In practice, that assumption collapses under real-world conditions of power asymmetry, economic pressure, and platform dominance. The GDPR’s heavy reliance on consent has not only failed to prevent models like Pay-or-Consent; it may have actively facilitated their emergence.
 

(a) Surface Level—Failure to Prevent Coercive Models Like Pay-or-Consent

On the surface, the GDPR’s reliance on consent fails to prevent exploitative models like Pay-or-Consent. Although Article 4(11) requires consent to be “freely given” and Recital 42 makes clear that consent is not free where refusal results in detriment, the Pay-or-Consent model effectively violates both. Users are coerced into either accepting surveillance or paying a fee, especially in monopolized digital environments with no real alternatives.
 
This contradiction was exposed in two recent decisions. The European Data Protection Board (EDPB), in its 2024 Opinion, found such models generally invalid, and the Court of Justice of the European Union (CJEU), in the Bundeskartellamt case (C-252/21), acknowledged the “clear imbalance” between users and platforms; yet neither prohibited the practice or addressed the deeper flaw in the legal framework that enables it. The CJEU’s vague endorsement of “appropriate” privacy fees reflects not just an enforcement failure but a deeper confusion about the role of consent in protecting fundamental rights.
 
(b) Deeper Level—Its Underlying Flawed Premise Enables the Emergence of These Coercive Models

At a deeper level, the GDPR’s failure stems not just from weak enforcement, but from a structural flaw in its foundational logic. By placing consent at the center of lawful data processing, the GDPR assumes that individuals can meaningfully authorize the use of their personal data and that this authorization can limit the liability of data controllers. This premise treats data privacy not as a universal, non-waivable right, but as a negotiable preference that individuals can surrender through consent mechanisms and contract-like transactions.
 
In practice, this premise enables the rise of coercive models like Pay-or-Consent at the root. Because the framework equates valid consent with lawful processing, it offers platforms a path to compliance through appearance rather than substance. So long as a company can extract formal “consent,” regardless of power imbalances, economic pressure, or the absence of alternatives, it is insulated from legal accountability. This incentivizes companies to build structurally coercive designs that exploit consent not as a safeguard but as a liability shield.

Models like Pay-or-Consent are therefore not legal anomalies but the predictable outcome of this legal structure. The implications of this structural failure are far-reaching: data privacy is a public and environmental condition, essential to maintaining baseline expectations of dignity, autonomy, and non-exploitation in digital life.

D. The Path Forward

To truly protect data privacy as a fundamental right, the law must go beyond banning coercive models like Pay-or-Consent and confront the deeper structural flaw in data protection frameworks: the reliance on consent as a basis for waiving rights and limiting liability. Privacy must be treated as a non-negotiable entitlement, not a market transaction. Addressing this requires legal reform that replaces consent-based frameworks with a rights-based model, one that explicitly prohibits the commodification and surrender of privacy rights under the guise of individual choice.
 
