Law in the Internet Society

Fake News and Our Changing Information Ecology

-- By ArashMahboubi - 19 Feb 2017

Zuckerberg Strikes Again?

Mr. Zuckerberg has attained an unenviable record: "He has done more harm to the human race than anybody else his age." While I have been reluctant to accept his ascension to the absolute top of this list, Zuckerberg's, and in turn Facebook's, recent fake news crisis has fueled his detractors. At a time when American democracy is reeling into an uncertain phase, a premium must be placed on an information ecology that demands accurate information. But for all of the blame Zuckerberg does deserve, the "fake news" anger directed his way seems misguided. Facebook no doubt helped usher fake news into our information ecology, but the root of the problem runs much deeper than the "trending" articles tab on Facebook. Facebook serves better as a case study than as a scapegoat for the changes occurring in our information ecology. The current shift in that ecology is marked by a more decentralized culture of journalism, one that enables people to create content alongside established media institutions.

Changes in the News Industry

Technological advances have been at the forefront of changes in the news industry. National-brand advertising has given way to automated exchanges that place ads across thousands of sites, regardless of the accuracy of their content. The contemporary information ecology is distinguished by a scale, a scope, and horizontal information flows unlike anything we have seen before. Politicians no longer need to rely on journalists employed by traditional, credible news sources to reach their audiences, as Trump's activity on Twitter demonstrates. Technology has made tapping into the national audience easier than ever. Unfortunately, it has also bypassed the credibility checks the news industry once had in place. Little now prevents fringe ideas and arguments from entering our information ecology and spreading rapidly through it.

The dangers fake news poses to our information ecology have reached unprecedented levels. Conventionally, credible news sources with fact-checking editors dominated the news market, and readers could reasonably trust information disseminated by sources such as the New York Times. But in an age where news can be spread instantaneously to millions of readers at the click of a button, it is becoming harder to tell which sites are trustworthy. Studies show that 14 percent of Americans called social media their "most important" source for election news, and this number is likely to increase over time. The introduction of platforms like Google and Facebook into the media ecosystem presents problems to the public that traditional journalistic outfits did not. These platforms' reliance on algorithmic curation, driven by attention- and behavior-collection systems, makes it more difficult for readers to identify biased or fabricated news.

Such changes have upended the business logic that once pressed journalists toward middle-of-the-road consensus. When large media outlets such as CNN, Fox News, and NBC dominated the market, they competed to attract the broadest audience possible while alienating as small an audience as possible. But with technology broadening the spectrum of news sources, audiences can subscribe to outlets that speak directly to their particular interests, beliefs, and emotions, giving rise to outlets such as Breitbart News. Our information ecology is shifting toward audiences reinforcing their beliefs rather than being exposed to outlets appealing to the broad center of American political opinion. These trends have been in place since the beginning of the Internet, but they accelerated sharply with the emergence of social media such as Facebook. Readers are suddenly taking on the role of editors and publishers, and the best way to get them to read a story is to appeal to their feelings, such as fear and anger. In fact, a recent paper in Human Communication Research found that "anger was the 'key mediating mechanism' determining whether someone shared information on Facebook; the more partisan and enraged someone was, the more likely they were to share political news online."

Fake News: Facebook as a Case Study

There are [[https://zephoria.com/top-15-valuable-facebook-statistics/][over 1.86 billion monthly active Facebook users]]. Facebook confounds the news by wrapping every story in the same skin, whether it comes from the New York Times or is entirely fabricated. Facebook then forwards these packages to users without any fact-checking filter. Leading up to election day, engagement with fake news surpassed engagement with credible mainstream news. Paul Horner, a leading author of fake-news stories who uses Facebook as his primary medium, went on record to say, "I think Donald Trump is in the White House because of me." Seventeen of the twenty most popular fake election stories shared on Facebook were pro-Trump or anti-Clinton. Among the most popular were claims that WikiLeaks had confirmed Hillary Clinton sold weapons to ISIS, that Clinton had been disqualified from holding any federal office, that a Trump protestor was paid $3,500 by the Clinton campaign, and that Pope Francis had endorsed Trump. These stories even fooled Eric Trump, Donald Trump's son, and Corey Lewandowski, Trump's campaign manager at the time, into sharing them.

Future of Journalism

While there may be ways for social media platforms like Facebook to combat fake news, the deeper problem is unlikely to be fixed without concurrent transformations in reception practices. In this light, Zuckerberg and Facebook are not the problem, but rather an entity capitalizing on a problem facing our information ecology. There are surely algorithms that exist, or can be developed, to identify sources that continually spread fake news and automatically restrict their ability to earn advertising revenue. However, this can only be seen as a temporary fix; one that could even lead to a game of cat and mouse in which perpetrators dismantle one operation and launch another. And even when a story is debunked, misinformation can be very difficult to correct and may have lasting effects after it is discredited. Beneath this shallow fix lies a problem stemming from the changes in our information ecology. It is probably unfeasible to demand that platforms such as Twitter and Facebook take on the traditional editorial processes of the press, but such a change need not result in less credible news. Perhaps our society will embrace a more decentralized and equally effective culture of critique; we must wait to find out.
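The kind of source-level restriction imagined above can be sketched in a few lines. The following toy Python class is purely illustrative: the class name, thresholds, and fact-check signal are my own assumptions, not any platform's actual system. It flags sources whose fact-check failure rate stays high over a minimum number of stories, the sort of signal a platform could feed into ad-revenue restrictions:

```python
# Hypothetical sketch of a source-reliability tracker. A platform records,
# per source, how many of its stories failed an external fact-check; sources
# with enough history and too high a failure rate are flagged.
from collections import defaultdict

class SourceReliabilityTracker:
    def __init__(self, flag_threshold=0.5, min_stories=10):
        self.flag_threshold = flag_threshold  # maximum tolerated failure rate
        self.min_stories = min_stories        # avoid flagging on tiny samples
        self.stories = defaultdict(int)       # source -> stories seen
        self.failures = defaultdict(int)      # source -> fact-check failures

    def record(self, source, passed_fact_check):
        """Log one story from a source and whether it passed fact-checking."""
        self.stories[source] += 1
        if not passed_fact_check:
            self.failures[source] += 1

    def flagged_sources(self):
        """Return sources with enough history and a high failure rate."""
        return {
            s for s, n in self.stories.items()
            if n >= self.min_stories
            and self.failures[s] / n > self.flag_threshold
        }
```

Even this trivial design exhibits the cat-and-mouse weakness noted above: a flagged operator can simply register a new domain, resetting its history to zero.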

In the end, I'm no more sympathetic to Facebook here than usual, but I don't feel convinced by the account of the problem presented in this draft.

Attention- and behavior-collection systems value "content" bits according to how much attention and behavior they stimulate. This is as true of the ones run in the 20th century by William Randolph Hearst or Roy Howard as of the ones run in the 21st century by Rupert Murdoch, Larry Page, or Mark Zuckerberg. The Spanish-American War was started by "false news" that Mr Hearst intentionally fabricated. The Facebook news feed problem isn't the result of some failure on Mr Zuckerberg's part to embrace a historical morality well exemplified by Rupert Murdoch, after all.

The definition of "news" as "stuff Thomas Jefferson would want you to read if he were here and knew as much as Dean Baquet about current events" doesn't make any operational sense, even if we think we want it to. So we need to understand more completely both the "information ecology" we think we are losing---and why we think that---and the one we are gaining. That's impossible in 1,000 words of course. But it's important to try.
