Computers, Privacy & the Constitution

The EU DIGITAL SERVICES ACT: PROTECTING INTERESTS OR EXTENDING PATERNALISM?

-- By SebastianValdezOranday - 01 Mar 2024

BACKGROUND:

In 2022, the European Union passed the Digital Services Act (“DSA”) and the Digital Markets Act (“DMA”), twin acts designed to revamp the EU’s old approach to digital policing. Like the General Data Protection Regulation, the new acts apply both to EU operators and to non-EU companies that interact with the EU or its citizens.

WHAT THE DMA AND DSA ACTUALLY DO:

The DMA is intended to promote competition by leveling the playing field: it designates the largest platforms as “gatekeepers” and forces them to give third parties fair access to their platforms. The DSA applies to any online intermediary, including social media platforms and digital storefronts; the Commission distinguishes providers based on their size and influence (see the Digital Services Act). The largest providers must analyze societal risks, such as those related to illegal content, freedom of expression, media freedom, pluralism, discrimination, and election misinformation.

While the EU highlights its goals of setting bright-line rules to reduce confusion around content moderation and to increase competition among businesses, most interesting is its openly paternalistic aim of protecting “society at large” by ensuring companies are accountable to democratic control.

IMMEDIATE IMPACT:

Two days after the DSA took full effect, the European Commission announced an investigation and possible proceedings against TikTok for violations of the DSA. The investigation targets violations related to the promotion of behavioral addiction through system design and the failure to assess and counter risks to users’ mental health and radicalization.

Meta, formerly Facebook, welcomed the transparency required by the DSA and lauded the harmonization of compliance requirements among large providers. Still, DSA compliance costs large providers money, and the larger the provider, the more it pays. Meta and TikTok are among the companies challenging the supervisory fees assessed to fund the EU regulators who ensure compliance.

WHY THE FUSS?

This question is easier asked than answered. Broadly speaking, the EU’s actions can be boiled down to a response to one of two developments resulting from the proliferation of social media and internet communication.

First, and more cynically, the likelier of the two: the EU, like all great powers before it, hopes to establish control over, and rein in, the hold foreign actors have on the vast majority of its citizenry. American-based platforms and providers like Facebook, Instagram, WhatsApp, and Google are a ubiquitous part of the EU citizen’s life, and with that presence inevitably comes the belief that one’s citizens are subject to foreign control. It is no shock that much of the Act is geared toward the household American apps and not, say, Telegram or WeChat.

In fairness, the EU Commission does not stand alone in its crusade to control foreign media operators. One need look no further than the U.S. government’s animosity toward TikTok for a domestic example. Under the guise of regulating content access and protecting the next generation of Americans, Congress and the Executive seem most focused on effecting the sale of TikTok to a U.S.-based company. While not conclusive, such fixation suggests that, for the U.S., it is not so much about abating misinformation and disinformation as about making sure the suspect information is not coming from a foreign provider. While the EU acts do not take the outwardly drastic stance that the U.S. takes toward TikTok, the same sentiment is there: content can be unsafe, so long as it is unsafe under the home country’s watchful eye.

The other explanation is that the EU, like the U.S., is embracing its paternalistic role in internet and content regulation. Unfortunately for all parties involved, this explanation for the DSA and legislation like it requires us to make a couple of costly assumptions about the state of the world and the people who use the internet.

First, to validate this paternalistic role in content regulation that seems to ramp up during election years, the governing body must believe that its citizens are incapable of thinking critically or skeptically about the misinformation they might encounter on the web. Second, upon identifying these purported weaknesses, the body must believe that it is best positioned to regulate and control such content so its people’s best interests are served.

If the average citizen is really such a danger to themselves through internet consumption that a government truly needs to step in and impose oversight, these measures attack the symptoms and not the cause. If governments are sincere in their assessments that these moves are about protecting citizens and not a grab for power over foreign markets, they should shine the spotlight on the widespread lack of media literacy, particularly with respect to the rise of AI-generated content on platforms like X and TikTok, and even in film.

AMERICAN OUTCOMES

While the DSA’s requirements are limited to the EU and its citizens, the effects fall largely on American businesses, and it is possible that shifting attitudes in American society will bring the DSA’s goals to American shores, namely efforts to combat disinformation and risks to free speech in online content.

These issues, or at least their identification, are increasingly salient in America at a time when the Senate questions tech leaders over concerns about platform privacy, free speech, and election misinformation. This being an election year, I suspect these conversations will only be amplified. If American providers already have in place the systems and assessment mechanisms the DSA calls for, and if they get on board, a cultural shift toward accepting more government control over the operations of online providers is not out of the realm of possibility.

Unfortunately for all involved, the root of the issue will remain unsolved, and perhaps even worsen, under such a level of government control. If history has taught us anything, it is that suppressing ideas only causes people to seek them out more zealously.
