Law in the Internet Society

Learning Management Systems, Privacy, and Freedom of Thought.

-- By SapirAzur - 08 Dec 2021 (revised 9 Jan 2022)

In the past few years, we have witnessed exponential growth in distance-learning applications, including learning management and analytics systems. With the social-distancing restrictions introduced by the COVID-19 pandemic, professionals worldwide advocated even more passionately for a hybrid, analytics-oriented approach to academic learning. This paper presents the data-collection practices of the Canvas learning management system, discusses the relevant privacy concerns, and considers the implications for freedom of thought.

Numerous studies have suggested that learning management and analytics applications can support higher education and improve students' learning by providing data about learning activities and engagement, although only limited evidence supports those findings (Ifenthaler et al. 2020; Suchithra et al. 2015). In recent years, various educational institutions have nonetheless partnered with learning-analytics companies to collect and use data to assess students' behavior and build predictive analyses of performance, with the aim of enabling faculty to personalize learning.

Whether or not those systems are effective, I believe that the privacy risks and the potential harm to students' freedom of thought that they pose are substantial, particularly given the nature of the relationship between universities and students, and therefore call for a thorough public discussion.


Canvas, the learning management system used at Columbia University, maintains an extensive database of student information. This database includes personally identifiable information such as name, gender, profile information, pictures, linked social media accounts, association with interest groups, student memberships, user engagement, browser, IP address, and cookies.

According to Canvas's privacy policy, this data is collected, analyzed, and made (partially) accessible to the university, Canvas, its affiliates, and its service providers, including companies whose sole purpose is to increase profitability, such as Google Analytics. Furthermore, Canvas may use the data for its own purposes, including improving Canvas's platform.



Canvas claims it does not tie the information gathered through third-party analytics to identifiable information. However, apparent privacy risks remain regarding the data aggregation mechanism, the aggregator's identity, its security measures, and the implications of AI-driven security attacks, such as model inversion and membership inference attacks. Though a single data point in a database usually reveals no meaningful information, aggregating multiple points may yield non-trivial insights. Even "unidentifiable" attributes can therefore narrow a dataset down to a single individual with a simple three-step query: women > age 25-30 > Israel. The field of statistical learning, and specifically data mining, systematically leverages computational methods to draw such inferences from aggregated datasets.
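The three-step query above can be sketched in a few lines of code. This is a minimal illustration, not Canvas's actual schema: the dataset, field names, and values are invented for the example, and the point is only that successive filters on "anonymous" attributes can leave a single record standing.

```python
# Hypothetical "anonymized" dataset: no names, only demographic attributes.
# All records and field names below are invented for this sketch.
records = [
    {"record_id": 1, "gender": "woman", "age": 27, "country": "Israel"},
    {"record_id": 2, "gender": "woman", "age": 41, "country": "Israel"},
    {"record_id": 3, "gender": "man",   "age": 28, "country": "Israel"},
    {"record_id": 4, "gender": "woman", "age": 26, "country": "France"},
    {"record_id": 5, "gender": "man",   "age": 33, "country": "Canada"},
]

# Step 1: women. Step 2: age 25-30. Step 3: Israel.
matches = [
    r for r in records
    if r["gender"] == "woman"
    and 25 <= r["age"] <= 30
    and r["country"] == "Israel"
]

# Three innocuous filters reduce the whole dataset to one record:
# the individual is effectively re-identified without any "identifiable" field.
print(len(matches))  # → 1
```

Privacy researchers call attributes like these "quasi-identifiers"; the smaller the group that shares a combination of them, the weaker the protection that removing names provides.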

Whether the data is kept securely by its alleged owners or is compromised, and whether it is identifiable or aggregated, there is no way of knowing how, and to what extent, it will be used in the future by those who gain access to it.

Is learning management software the antithesis of assisting learning?

Should the process of thinking, writing, or engaging in learning activities be surveilled or managed?

Privacy risks are not the only problem with learning management software. Though technology is an inseparable part of modern civilization, I would dare argue that, at their core, technological management systems not only go against our nature but also threaten our freedom of thought.

I doubt that the greatest minds can grow from technologically surveilled and managed education, as the process of learning, at its core, is meant to generate creativity and critical thought. A student should feel safe asking questions and cultivating human relationships in an educational environment, and I believe a surveilled environment genuinely harms that.

There is nothing wrong, in my mind, with personalized education software. In fact, much concentrated work in different fields of science and technology, including software, hardware, agriculture, and medicine, has introduced precision and personalization through big data and machine learning. Learning software should aim at the opposite of learning management: to personalize the educational experience, not in order to surveil or manage it, but to offer a richness of learning opportunities the student would not otherwise have access to.

Using new technologies is tempting: it is easy, profitable, and innovative. But the benefit comes at a risk to freedom of thought and privacy that, as a society, we need to be concerned about and conscious of, and must take proactive measures to address.


Weighing the advantages of acquiring analytical insights to "enhance" learning against privacy risks, ethical concerns, and freedom of thought, it becomes increasingly clear that the cost of management learning is substantial. That does not mean the innovative or engaging approach should be forsaken, nor that the benefits of learning programs are small, but rather that an interactive system in which students can provide feedback must be: (1) sufficiently protective of their data and privacy; and (2) aligned with the true essence of the learning process. This is particularly important given the special relationship between universities and their students, in which students may labor under the misconception that universities, as places of higher education, know best and would not jeopardize their privacy or freedom of thought. From my point of view, the most desirable solution is an independent system designed to align with our core values and needs: an interactive educational platform that has already proven to provide the benefits of technology without the potential harm, such as the one these very words are written in.
