Law in the Internet Society

EungyungEileenChoiFirstEssay - 17 Jan 2020 - EungyungEileenChoi

Asch Conformity and Cambridge Analytica

But does the number of 'likes' or 'dislikes' or fake news really matter? To answer this question, let's turn to a classic psychological experiment by Solomon Asch that suggests it does. In his well-known conformity experiment, Asch observed that about one third (32%) of participants conformed to a clearly wrong majority view, although most of them later said they did not actually believe their answers to be true. McLeod, S. A. (2018, Dec 28). Solomon Asch - Conformity Experiment. https://www.simplypsychology.org/asch-conformity.html Asch suggested that it was group pressure that made those people answer in conformity with the majority, and his results also indicate that people are more likely to conform as the majority grows larger. So the number of 'likes' or 'dislikes' on a posting about a political candidate might influence prospective voters. Voters might refrain from expressing their views if they see that the majority's 'likes' or 'dislikes' differ from their own, in a poll, for example. The poll results, in turn, could affect those voters' and other voters' decisions. False or unverified information degrading a candidate might not be believed outright, but if voters are exposed to the disinformation over and over again, more of them are likely to believe the story. As the number of believers grows, group pressure can kick in with a snowballing effect: people who do not believe the story fall silent, the apparent majority of believers grows even larger, the pressure increases further, and so on.

Public opinion manipulation on the internet typically combines the use of social media with the distribution of fake news, i.e., propaganda. Social media plays a crucial role in both directions: collecting personal data from individuals and reaching those same individuals with fake information. (For a more detailed explanation of how public opinion manipulation on the internet works, see https://www.usenix.org/node/208126.) Mass data collection makes it possible to sort out the most vulnerable people, find their weakest spots, and bombard them with fake information designed to provoke the desired behavior by attacking exactly those spots. "The Great Hack", a documentary on how Cambridge Analytica used personal information to interfere with the 2016 U.S. elections, shows that, coupled with behavioral targeting, the impact can be large enough to reverse the result of an election. Even if the outcome remains unchanged, creating the public appearance that something might be wrong with the process puts electoral legitimacy at risk. So if the Russians or anyone else wished to sabotage the upcoming 2020 presidential elections in the U.S., they could do it simply by creating chaos and distrust around the system. https://nyti.ms/2uyBXk

 

Defending Democracy in the Age of Internet Society

While it will be up to cybersecurity forces to monitor and detect suspicious activity on the internet, I wonder what I, as a lawyer or simply as one of the people, can do to help protect the integrity and legitimacy of our democratic system against malicious attempts to manipulate public opinion.

First, knowingly distributing fake information with the intention of manipulating public opinion to harm a political candidate or party should be recognized as a separate crime subject to severe punishment. Of course, spreading information is a form of speech protected by the constitutions of many countries. Courts in the U.S. and South Korea share the view that freedom of speech covers at least some false speech. See United States v. Alvarez, 567 U.S. 709, 132 S. Ct. 2537 (2012); 2008Hunba157, Korean Constitutional Court (Dec. 28, 2010). Political speech, moreover, is especially strongly protected in the U.S. See McIntyre v. Ohio Elections Comm'n, 514 U.S. 334, 115 S. Ct. 1511 (1995). However, freedom of speech is not absolute, and the government may restrict speech so long as the restriction is justifiable. See Human Life of Wash., Inc. v. Brumsickle, 2009 U.S. Dist. LEXIS 4289 (W.D. Wash. Jan. 8, 2009). In the U.S., justifiable restrictions on political speech must survive strict scrutiny; in Korea, a proportionality test is applied. Whatever level of scrutiny is applied, my view is that a prohibition of the kind described above should survive it, because i) false statements of fact have less value, since they contribute nothing to the marketplace of ideas, and ii) it is, after all, for democracy's sake that we protect freedom of speech, so that if the consequences of particular speech materially harm democracy, protecting that speech would defeat the purpose.
 
Second, it should be prohibited to monitor people's online activities for purposes that have not been properly disclosed to, and consented to by, the individual. Although websites provide notices such as cookie policies, most of the time the information is insufficient and too vague. Moreover, they don't offer a genuine option to opt out: either you agree to the policy or you cannot use the website. Further, one cannot choose the purposes for which one's information is used; one must take it all or leave it all.
 
It's not clear what this paragraph has to do with the preceding parts of the draft.
