{"id":1205,"title":"Digital public: looking at what algorithms actually do","link":"https:\/\/www.reframetech.de\/en\/2018\/02\/13\/digital-public-looking-at-what-algorithms-actually-do\/","date":"02\/13\/2018","date_unix":1518512400,"date_modified_unix":1659961258,"date_iso":"2018-02-13T09:00:00+00:00","content":"<p>The development and expansion of today\u2019s communications platforms have led to a radical change in how public discourse is conducted and public opinion formed. In particular, the traditional boundary between personal and public communication has disappeared.<\/p>\n<p>A prime example is a 2017 case in which the American actor William Shatner \u2013 best known for playing Captain Kirk in the 1960s TV series <em>Star Trek<\/em> \u2013 tweeted about the organisation <strong><a href=\"http:\/\/www.slate.com\/articles\/health_and_science\/science\/2017\/04\/what_we_can_learn_from_william_shatner_s_twitter_meltdown.html\" target=\"_blank\" aria-label=\"Opens in a new tab\" >Autism Speaks<\/a><\/strong>, known for its claims that autism is caused by vaccines. Among others, David Gorski, an oncologist at Wayne State University in Detroit who advocates for evidence-based interventions, replied to Shatner\u2019s tweet and explained why Autism Speaks is a controversial organisation. In response, Shatner searched for Gorski\u2019s name on Google and shared articles about him from a conspiracy-oriented website called TruthWiki. Asked why he had not read and linked <strong><a href=\"https:\/\/en.wikipedia.org\/wiki\/David_Gorski\" target=\"_blank\" aria-label=\"Opens in a new tab\" >Gorski\u2019s Wikipedia entry<\/a><\/strong>, Shatner responded that TruthWiki was higher up in his Google search results. 
You can find it \u201call on Google,\u201d he <strong><a href=\"https:\/\/twitter.com\/WilliamShatner\/status\/849773578559959040\" target=\"_blank\" aria-label=\"Opens in a new tab\" >maintained<\/a><\/strong>, as if that itself were a sign of high quality.<\/p>\n<p>Google and other platforms are incredibly powerful tools that allow all of us \u2013 and Shatner, too \u2013 to locate information in the blink of an eye. To do so they use computer algorithms that measure \u201crelevance\u201d, but the standards applied often do not correspond to the criteria that reputable journalists or researchers would use.<\/p>\n<h4>Custom-fitted \u2018relevance\u2019<\/h4>\n<p>Algorithms work mostly <em>descriptively<\/em> and <em>individually<\/em>. For example, they adjust relevance for a user based on what links he or she has clicked in the past. Yet many users assume the results are normative (\u201chigher up in the Google results\u201d). In the Shatner\/Gorski case, the assertion of a correlation between autism and vaccines encouraged a small but highly motivated group of users in their online activities and ensured that a significant gap opened between content quality and \u201crelevance\u201d as determined by Google\u2019s algorithms.<\/p>\n<p>This is not simply a matter of a handful of telling cases. Because of their ubiquity, so-called intermediaries such as Google and Facebook now influence how public opinion is formed. <strong><a href=\"http:\/\/www.die-medienanstalten.de\/fileadmin\/Download\/Veranstaltungen\/Pr%C3%A4sentation_Intermedi%C3%A4re\/TNS_Intermedi%C3%A4re_und_Meinungsbildung_Pr%C3%A4si_Web_Mappe_final.pdf\" target=\"_blank\" aria-label=\"Opens in a new tab\" >57% of German Internet users<\/a><\/strong> get their information about politics and social affairs from search engines or social networks. 
And even though the share of those who say social networks are their most important source of news is relatively small \u2013 <strong><a href=\"http:\/\/www.die-medienanstalten.de\/fileadmin\/Download\/Veranstaltungen\/Pr%C3%A4sentation_Intermedi%C3%A4re\/TNS_Intermedi%C3%A4re_und_Meinungsbildung_Pr%C3%A4si_Web_Mappe_final.pdf\" target=\"_blank\" aria-label=\"Opens in a new tab\" >6% of all Internet users<\/a><\/strong> \u2013 it is considerably higher among younger users.<\/p>\n<p>As researchers at the Hamburg-based Hans Bredow Institute put it in 2016, the formation of public opinion is \u201c<strong><a href=\"http:\/\/www.hans-bredow-institut.de\/webfm_send\/1172\" target=\"_blank\" aria-label=\"Opens in a new tab\" >no longer conceivable without intermediaries<\/a><\/strong>\u201d.<\/p>\n<h4>Maximising engagement<\/h4>\n<p>The design principles used by intermediaries are leading to a structural change in public discourse. Anyone can now publish whatever they like, but not everyone will find an audience. Attention is generated only when people interact with algorithmic decision-making (ADM) processes. ADM processes determine the individual relevance of content items on social networks such as Facebook and select the items to be displayed for each user. In assembling an individual user\u2019s feed, Facebook examines which content that person and his or her friends prefer or hide. Both signals are based on actions that are relatively straightforward.<\/p>\n<p>Facebook also undoubtedly deploys signals that users are not consciously aware of sending, such as the amount of time they view a certain entry in the feed. Users who spend more time with any one item <strong><a href=\"http:\/\/www.slate.com\/articles\/technology\/cover_story\/2016\/01\/how_Facebook_s_news_feed_algorithm_works.html\" target=\"_blank\" aria-label=\"Opens in a new tab\" >signal approval without explicitly doing so<\/a><\/strong>. 
ADM systems play a significant role in other areas, too, such as assisting in legal matters or determining where and when <strong><a href=\"https:\/\/www.brennancenter.org\/legal-work\/brennan-center-justice-v-new-york-police-department\" target=\"_blank\" aria-label=\"Opens in a new tab\" >police officers are on duty<\/a><\/strong>.<\/p>\n<p>There is much less diversity among intermediaries than among editorially curated media. Even if each person using the services of today\u2019s major intermediaries is presented with an individual selection, the same selection principles are applied to all users, and these are controlled by centralised curators. What is new and crucial is that users\u2019 reactions and ADM processes together determine how much attention a piece of content receives as it is disseminated.<\/p>\n<h4>Negative emotions and cognitive distortions<\/h4>\n<p>Studies of networking platforms show that content that rouses emotion is commented on and shared most often \u2013 and above all when negative emotions are involved.<\/p>\n<p>Such polarising effects seem to depend on a number of additional factors, such as a country\u2019s electoral system. Societies with \u201cfirst past the post\u201d systems such as the United States are potentially <strong><a href=\"https:\/\/5harad.com\/papers\/bubbles.pdf\" target=\"_blank\" aria-label=\"Opens in a new tab\" >more vulnerable to extreme political polarisation<\/a><\/strong>. 
In countries with proportional systems, institutionalised multiparty structures and ruling coalitions tend to balance out competing interests.<\/p>\n<div class=\"postContentEmbed\">\n<div class=\"embedContainer embedContainer--video\"><iframe loading=\"lazy\" title=\"How YouTube&#039;s algorithm distorts reality\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube-nocookie.com\/embed\/aTxUetlqWmU?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe><\/div>\n<\/div>\n<h6>How YouTube&#8217;s algorithm can distort reality<\/h6>\n<p>Existing societal polarisation presumably influences and is influenced by the algorithmic ranking of media content. A <a href=\"http:\/\/www.pnas.org\/content\/113\/3\/554\" target=\"_blank\" aria-label=\"Opens in a new tab\" ><strong>2016 study published in the Proceedings of the National Academy of Sciences<\/strong><\/a> indicates that Facebook users who believe in conspiracy theories tend over time to turn to communities of conspiracy theorists holding the same views. This process is possibly intensified by algorithms that increasingly present them with content \u201crelevant\u201d to their views. These systems could in fact result in the creation of so-called echo chambers among people with extremist views.<\/p>\n<p>Below are three aspects of intermediary platforms that can influence the formation of individual and public opinions:<\/p>\n<ul>\n<li><strong>Intermediaries measure engagement through users\u2019 automatic, impulsive reactions.<\/strong> They use numerous variables to calculate relevance, ranging from basic behavioural metrics such as scrolling speed or the duration of page views to the level of interaction among multiple users in a social network. 
When someone with whom a user has repeatedly communicated on Facebook posts content, the probability that the user will be shown this content is higher than if it were posted by someone with whom the user has never truly interacted.<\/li>\n<li><strong>Intermediaries constantly change the variables they measure.<\/strong> The metrics signalling relevance are potentially problematic. Platform operators are hesitant to provide details of their metrics, both for competitive reasons and because the metrics themselves change constantly. Google and Facebook alter their systems continuously; the operators <strong><a href=\"https:\/\/www.facebook.com\/notes\/Facebook-data-science\/big-experiments-big-datas-friend-for-making-decisions\/10152160441298859\/\" target=\"_blank\" aria-label=\"Opens in a new tab\" >experiment with and tweak almost every aspect of the user interface<\/a><\/strong> and other platform features to achieve specific goals such as increased interactivity.<\/li>\n<li><strong>Intermediaries with the greatest reach promote unconsidered behaviour.<\/strong> Clicking on a \u201clike\u201d button or a link demands no cognitive effort, and many users are evidently happy to indulge this lack of effort. <strong><a href=\"https:\/\/hal.inria.fr\/hal-01281190\/file\/sigm095-gabielkov.pdf\" target=\"_blank\" aria-label=\"Opens in a new tab\" >Empirical studies<\/a><\/strong> by the French National Institute for Computer Science (INRIA) and Columbia University suggest that many of the articles users forward to their circles of friends with a single click could not possibly have been read first. Users thus disseminate media content after having seen only the headline and introduction. 
To some extent they deceive the algorithm and, with it, their \u201cfriends and followers\u201d into believing that they have engaged with the text.<\/li>\n<\/ul>\n<p>The ease of interaction also promotes cognitive distortions that social psychologists have known about for years. A prime example is the \u201cavailability\u201d heuristic: If an event or memory can easily be recalled, it is <a href=\"http:\/\/psiexp.ss.uci.edu\/research\/teaching\/Tversky_Kahneman_1974.pdf\" target=\"_blank\" aria-label=\"Opens in a new tab\" >assumed to be particularly probable or common.<\/a> Users repeatedly encounter media content that was forwarded unread on the strength of its headline alone, and they later remember that content as \u201ctrue\u201d or \u201clikely\u201d \u2013 even when the text itself points out that the headline is a grotesque exaggeration or simply misleading.<\/p>\n<h4>The need for diversity and transparency<\/h4>\n<p>Ensuring a diversity of media in the public sphere means ensuring that the ADM processes that assess relevance are diverse as well. Algorithms that rank content and personalise its presentation are at the heart of the complex, interdependent process underlying digital discourse. 
To bring transparency to ADM processes we need to:<\/p>\n<ul>\n<li>Make platforms and their impacts more open to external researchers.<\/li>\n<li>Promote diversity among algorithmic processes.<\/li>\n<li>Establish a code of ethics for developers.<\/li>\n<li>Make users more aware of the mechanisms now being used to influence public discourse.<\/li>\n<\/ul>\n<p>Organisations working for this kind of transparency include <strong><a href=\"https:\/\/algorithmwatch.org\/en\/\" target=\"_blank\" aria-label=\"Opens in a new tab\" >Algorithm Watch<\/a><\/strong>, based in Germany, and the US media watchdog <a href=\"https:\/\/www.propublica.org\/article\/making-algorithms-accountable\" target=\"_blank\" aria-label=\"Opens in a new tab\" ><strong>Pro Publica<\/strong><\/a>, which have published a number of donation-funded studies and articles on the issue.<\/p>\n<p>Combined with industry self-regulation and legislative measures, an unbiased understanding of the real social and political consequences of algorithmic ranking would make it possible to identify and counter dangers early on.<\/p>\n<hr \/>\n<p>This article was originally published on <a href=\"http:\/\/theconversation.com\" target=\"_blank\" aria-label=\"Opens in a new tab\" >The Conversation<\/a>. 
Read the <a href=\"https:\/\/theconversation.com\/digital-public-looking-at-what-algorithms-actually-do-91119\" target=\"_blank\" aria-label=\"Opens in a new tab\" >original article<\/a>.<\/p>\n","excerpt":"<p>The development and expansion of today\u2019s communications platforms have led to a radical change in how public discourse is conducted [&hellip;]<\/p>\n","thumbnail":"https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2018\/02\/AB.jpg","thumbnailsquare":"https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2018\/02\/AB-370x370.jpg","authors":[{"id":35,"name":"Christian St\u00f6cker","link":"https:\/\/www.reframetech.de\/blogger\/christian-stoecker\/"},{"id":10,"name":"Konrad Lischka","link":"https:\/\/www.reframetech.de\/blogger\/konrad-lischka-2\/"}],"categories":[{"id":2,"name":"Uncategorized","link":"https:\/\/www.reframetech.de\/en\/category\/uncategorized\/"}],"tags":[{"id":22,"name":"echo chambers","link":"https:\/\/www.reframetech.de\/en\/tag\/echo-chambers\/"},{"id":626,"name":"Facebook","link":"https:\/\/www.reframetech.de\/en\/tag\/facebook-en\/"},{"id":23,"name":"filter bubbles","link":"https:\/\/www.reframetech.de\/en\/tag\/filter-bubbles\/"},{"id":113,"name":"Google","link":"https:\/\/www.reframetech.de\/en\/tag\/google\/"},{"id":114,"name":"Poliarization","link":"https:\/\/www.reframetech.de\/en\/tag\/poliarization\/"}]}