Law in the Internet Society

EungyungEileenChoiFirstEssay 9 - 12 Jan 2020 - Main.EungyungEileenChoi
 The history of internet opinion manipulation in Korean politics dates back years. In 2004, 2008, and 2012, politicians from several parties were convicted of hiring people to spread disinformation online to slander their opponents in primary races or elections. A former head of the Korean intelligence agency is serving a prison sentence for ordering his subordinates to post or retweet masses of comments in support of Ms. Park, who was then running for president.
Asch conformity and Cambridge Analytica

 Now, some might ask: what real harm does this do? Does the number of 'likes' or 'dislikes' really matter? Don't people have their own views and opinions? Even major news media spread incorrect or inaccurate news from time to time. More importantly, it is usually very difficult to draw a clear line between information, misinformation, and disinformation. If we were to ban any information that carries a risk of being incorrect, wouldn't that sacrifice the sacred freedom of speech?

A classic psychological experiment by Solomon Asch suggests that those things do matter. In his well-known conformity experiment, Asch observed that about one third (32%) of participants conformed to a clearly wrong majority view, although most of them later said they did not actually believe their answers to be true. (McLeod, S. A. (2018, Dec 28). Solomon Asch - Conformity Experiment. https://www.simplypsychology.org/asch-conformity.html) Asch attributed this to group pressure, and the experiment also showed that people are more likely to conform as the majority grows larger. The number of 'likes' or 'dislikes' on a posting about a political candidate might therefore influence prospective voters. Voters may refrain from expressing their views when they see that the majority's 'likes' or 'dislikes' differ from their own, in a poll, for example. The poll results, in turn, could affect the same or other voters' decisions. False or unverified information degrading a candidate might not be believed outright, but voters who are exposed to the disinformation over and over again become more likely to believe it. As the number of believers grows, group pressure kicks in with a snowballing effect: those who do not believe the story fall silent, the majority who do believe it grows even larger, and the pressure rises further. "The Great Hack", a documentary on how Cambridge Analytica used personal information to interfere with the 2016 U.S. election and many others, shows that, coupled with behavioral targeting, the impact can be large enough to reverse the result of an election.
 
However, the impact need not be substantial enough to overturn an election. Merely creating the public appearance that something might be wrong with an election can sow distrust and division among the public. Those who lost will find it harder to accept the results, questioning their fairness, while those who won will blame the losers for not respecting them. This polarizes the public and makes it harder for the community to make important decisions. According to recent news reports, U.S. cyberdefense agencies have already detected attempts to sway the 2020 American elections in a less obvious way, through hacking and disinformation designed to create chaos. https://nyti.ms/2uyBXk
 
Finding a way

 I want to start with a few small suggestions that will hopefully help prevent misuse of the internet for political purposes:
