
Australia’s Child Social Media Ban as Regulatory Capture

-- By JohnGeorge - 20 Dec 2024



In November 2024, the Commonwealth of Australia passed one of the most stringent social media regulations in the developed world. The Online Safety Amendment (Social Media Minimum Age) Bill 2024 imposes a series of requirements on social media companies with the principal goal of barring individuals under the age of 16 from the major platforms, including global giants like Facebook, Instagram, TikTok, and X (formerly Twitter). While the law's stated purpose is to protect young users from cyberbullying, exploitation, and exposure to harmful content, its implications extend far beyond safety. The policy exemplifies how well-intentioned regulation can unintentionally reinforce the market dominance of Big Tech, suppress competition, and stifle innovation.

The Australian legislation is part of a broader trend across Western societies toward holding social media platforms accountable for the harms they inflict on our species, and most disturbingly on our children, during the most vulnerable years of psychosocial development. As parents around the world have discovered to their horror, social media has become the center of many young people's existence. During the particularly sensitive years of adolescence, children are handed tiny bricks through which their neural circuits and reward centers are constantly exploited in order to sell their attention to advertisers, who pay handsomely for perhaps the most effective per-dollar advertising product ever sold, owing to its algorithmic, data-driven precision in targeting users. Every user of social media voluntarily becomes a product to be sold: slavery in postmodernity is entered into willingly, even eagerly, by the slave, and to take away the security of bondage is to expose the slave to the horrors of freedom. This is being done to elementary-age children. It is about time somebody did something to stop it!

Alas, this is the story one might tell. But it is not the whole story. For behind the magic curtain of postmodernity's Baudrillardian simulacrum – this façade of technological doublespeak that has infected our businesses, our children's lives, our democracies – lies a deeper and darker truth. The "social media ban" is not what it purports to be and is widely recognized as: the empowered voice of the citizenry speaking through her elected representatives. In reality, it is yet another act of self-cannibalism by the parasite that has captured our loftiest institutions and, like Kronos himself, devours its own children to sustain its existence.

The Online Safety Amendment (Social Media Minimum Age) Bill 2024 does reflect a serious attempt to control the influence of social media on children, which in isolation is a laudable goal. However, as is often the case with shadowy attempts at "legal" corruption, the devil is in the details. Central to the legislation is a mandate that social media platforms implement stringent age-verification mechanisms, codified only as the requirement that providers "take reasonable steps to prevent children who have not reached a minimum age from having accounts." This vague standard leaves it to courts and regulatory agencies to determine what counts as "reasonable," a process that will inevitably evolve into a compliance burden falling disproportionately on entrepreneurs and smaller providers of social media services. Companies that fail to comply face penalties of up to $32.5 million.

Advocates for the bill emphasize its protective intent: shielding minors from dangers such as cyberbullying, exploitation, and obscene content. Yet the bill as approved by both houses of the Australian Parliament leaves the actual substance of regulation to other bodies, divorcing the law from the voice of the people as spoken through their legislators. As the Australian government has stated, the law provides "rule-making powers for the Minister for Communications to narrow or further target the definition" of "reasonable steps," and empowers the eSafety Commissioner and Information Commissioner to "seek information relevant to monitoring compliance, and issue and publish notices regarding non-compliance." This opens the door for the substance of the law to be written by revolving-door officials whose main effect, if not aim, is to harm Big Tech's competition. The language is defended as providing flexibility and dynamism in regulation, but what it does is hand lawmaking authority to bureaucrats insulated from democratic accountability. The practical effect is that large platforms can meet the guidelines by throwing money at the problem: hiring a few more compliance employees, adding perfunctory age-verification checks, and paying lawyers to contest penalties or simply paying the fine. For the wealthiest technology companies, the fines assessed under this regulation are merely a cost of doing business. For startups, venture capitalists, and entrepreneurs, they can be the end of the business. Of course, that is exactly what Big Tech wants.

The threat of regulatory capture becomes far more real when one considers the disproportionate power of the eSafety Commissioner in formulating and enforcing rules under the act. As outlined in Section 27, the Commissioner is tasked with creating these guidelines but is not explicitly bound by public consultation requirements or technical standards. Moreover, the government is explicit about how much discretion these regulators enjoy, and worse, there are pre-built exemptions: "the Government proposes to use the rule-making power to exclude messaging, online games, services that primarily function to support the education and health of end-users, and YouTube." WhatsApp has also been preliminarily exempted. This lack of transparency, and the explicit bias in the government's own messaging, will entrench the interests of incumbent Big Tech companies and raise the barriers facing new entrants. For those seeking to build services that protect privacy, collect less user data, or run on free and open-source software, Big Tech's lobbying dollars can, at least in Australia, make such projects cost-prohibitive or kill them once they have begun. Some regulatory changes affect all businesses roughly alike, but many, including this one, are a net gain for some and a net loss for others, because the main threat to most businesses is not the government but their competition. A rule that is bad for business in general can be very good for a particular business if it neutralizes that business's rivals. A big firm reasons that the enemy of my enemy is my friend: if my competition is my biggest problem, the government may punch both of us, but because I am bigger, I survive and my competition does not.

The Online Safety Amendment (Social Media Minimum Age) Bill 2024 is framed as a protective measure that finally shields children from the damaging effects of social media. In reality, it illustrates regulatory overreach and even regulatory capture, and it demonstrates the counterintuitive but all-too-real truth that regulation is good for established players. It imposes almost comically vague standards on social media companies small and large, with no carve-outs for small businesses or startups, and delegates a similarly comical amount of authority to regulators to create and enforce the law. The pre-built exemptions for the largest Big Tech services and the lack of democratic oversight make it a recipe for regulatory capture. It would do the Australian citizenry well to wake up to this fact; but alas, they are not America, and so they will stay sleeping with the kangaroos.
