{"id":948,"title":"\u201cMy Colleague the Robot\u201d \u2013 How People and Automated Assistants Work Together at Wikipedia.","link":"https:\/\/www.reframetech.de\/en\/2017\/12\/06\/my-colleague-the-robot-how-people-and-automated-assistants-work-together-at-wikipedia\/","date":"12\/06\/2017","date_unix":1512571508,"date_modified_unix":1634652841,"date_iso":"2017-12-06T14:45:08+00:00","content":"<p><em>Bots take care of many routine tasks at Wikipedia. The automated assistants greatly influence how the community interacts at the online encyclopedia. Wikipedians have learned to respond to the social effects of algorithms \u2013 for example by creating their own bot policy.<\/em><\/p>\n<p><!--more--><\/p>\n<p>Wikipedia is a giant laboratory for exploring how algorithms and people can work together (see Part 1 of our series on Wikipedia bots: \u201c<strong><a href=\"https:\/\/www.reframetech.de\/en\/2017\/11\/08\/waechter-des-weltwissens-wie-automaten-wikipedia-beschuetzen\/\">Guardians of Global Knowledge<\/a><\/strong>\u201d). Wikipedians working on the English-language Wikipedia put together an extensive set of guidelines for using automated bots soon after the site was launched. And while every Wikipedia user can run bots on their own computer to automatically change articles in the encyclopedia, anyone who does so without permission from the Wikipedia community risks having their account blocked.<\/p>\n<p><strong>Wikipedia\u2019s six laws of robotics<\/strong><\/p>\n<p>The Wikimedia Foundation operates servers which volunteers can use to store their bots. The advantage is that programs can be tested there without doing any harm, and operators don\u2019t have to worry about damaging the infrastructure that runs the software. Bots <strong><a href=\"https:\/\/en.wikipedia.org\/wiki\/Wikipedia:Bot_policy\" target=\"_blank\" aria-label=\"Opens in a new tab\" >are given certain privileges <\/a><\/strong>depending on the community in question. 
For example, they can make more changes per minute than a normal user is permitted to carry out, and their input is generally not checked for vandalism as often as that of human authors. Ultimately, each community working on Wikipedia\u2019s numerous language versions and sister projects decides for itself how it wants to engage with bots.<\/p>\n<p>To be granted the above privileges, however, the bots have to be approved by the community. A Bot Approvals Group decides which automated programs may run on the English-language Wikipedia. The group has put together six requirements that a bot must meet to be approved.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-904\" src=\"https:\/\/www.reframetech.de\/en\/wp-content\/uploads\/sites\/23\/2017\/12\/Bot_Gesetz_ENGL_1.jpg\" alt=\"\" width=\"780\" height=\"373\" srcset=\"https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2017\/12\/Bot_Gesetz_ENGL_1.jpg 780w, https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2017\/12\/Bot_Gesetz_ENGL_1-768x367.jpg 768w, https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2017\/12\/Bot_Gesetz_ENGL_1-600x287.jpg 600w\" sizes=\"auto, (max-width: 780px) 100vw, 780px\" \/><\/p>\n<p>These criteria are the result of a long learning phase: Wikipedians had long wanted to at least partly automate what is in effect the repository of the world\u2019s knowledge. 
Yet while trying to do so they repeatedly encountered problems that made it necessary to reconsider the existing rules.<\/p>\n<p><strong>Small mistakes with a big impact<\/strong><\/p>\n<p><strong><a href=\"http:\/\/firstmonday.org\/ojs\/index.php\/fm\/article\/view\/6027\/5189\" target=\"_blank\" aria-label=\"Opens in a new tab\" >The first example was the \u201crambot\u201d.<\/a><\/strong> Beginning in October 2002, it used data made available online by the US government to create articles about cities and towns not yet included in Wikipedia. The rudimentary entries created by rambot only contained basic information, such as location and population, which had been inserted into a simple template. On the one hand, the experiment was highly successful: rambot created over 30,000 new articles at a time when the English Wikipedia only had 50,000.<\/p>\n<iframe id=\"datawrapper-chart-HdJp9\" src=\"\/\/datawrapper.dwcdn.net\/HdJp9\/1\/\" scrolling=\"no\" frameborder=\"0\" allowtransparency=\"true\" allowfullscreen=\"allowfullscreen\" webkitallowfullscreen=\"webkitallowfullscreen\" mozallowfullscreen=\"mozallowfullscreen\" oallowfullscreen=\"oallowfullscreen\" msallowfullscreen=\"msallowfullscreen\" style=\"width: 0; min-width: 100% !important;\" height=\"586\"><\/iframe><script type=\"text\/javascript\">if(\"undefined\"==typeof window.datawrapper)window.datawrapper={};window.datawrapper[\"HdJp9\"]={},window.datawrapper[\"HdJp9\"].embedDeltas={\"100\":721,\"200\":667,\"300\":640,\"400\":613,\"500\":613,\"600\":586,\"700\":586,\"800\":586,\"900\":586,\"1000\":586},window.datawrapper[\"HdJp9\"].iframe=document.getElementById(\"datawrapper-chart-HdJp9\"),window.datawrapper[\"HdJp9\"].iframe.style.height=window.datawrapper[\"HdJp9\"].embedDeltas[Math.min(1e3,Math.max(100*Math.floor(window.datawrapper[\"HdJp9\"].iframe.offsetWidth\/100),100))]+\"px\",window.addEventListener(\"message\",function(a){if(\"undefined\"!=typeof a.data[\"datawrapper-height\"])for(var b in 
a.data[\"datawrapper-height\"])if(\"HdJp9\"==b)window.datawrapper[\"HdJp9\"].iframe.style.height=a.data[\"datawrapper-height\"][b]+\"px\"});<\/script>\n<h6>In the study &#8220;<a href=\"http:\/\/www.iisi.de\/fileadmin\/IISI\/upload\/C_T\/2013\/ct2013_proceedings_S3-1_Mueller_Dobusch_Herbsleb.pdf\" target=\"_blank\" aria-label=\"Opens in a new tab\" >The Emergence of Algorithmic Governance in Wikipedia<\/a>&#8221; the authors examined how much the number of bot edits in the German-language Wikipedia has increased over the years.<\/h6>\n<p>On the other hand, the automated effort quickly produced a number of problems. Errors in the data accessed by the bot resulted in more than 2,000 defective articles, causing even more work for the site\u2019s human authors. At times rambot overtaxed the young encyclopedia\u2019s resources. For example, users looking at Recent Changes \u2013 an automated log of the latest edits \u2013 had a hard time understanding what was happening on the platform since the list was inundated with entries stemming from the bot\u2019s work.<\/p>\n<p>As a result, the English Wikipedia community wanted to stop using bots completely, at least temporarily. Yet given Wikipedia\u2019s rapidly growing significance and complexity, they ultimately decided to deploy bots on the site \u2013 albeit only those that met strict requirements.<\/p>\n<p><strong>Minor details become full-blown controversies<\/strong><\/p>\n<p>Sometimes the conflicts between humans and machines are not technical, but social. This can be seen in an incident described by ethnographer R. Stuart Geiger.<\/p>\n<p>In 2006, a Wikipedia author using the pseudonym \u201cHagerman\u201d developed a bot to fix what he believed was a simple defect: Wikipedia does not have a traditional discussion forum. Instead, users have to insert their comments at the appropriate spot on discussion pages and use the characters \u201c--~~~~\u201d to sign them. That makes it clear who contributed and when. 
As Hagerman noticed, however, many entries lacked a signature, with the result that already confusing discussions were even harder to follow.<\/p>\n<p>Hagerman decided to program a bot to insert missing signatures and got approval to do so from the newly formed Bot Approvals Group. Although the program worked as intended and was altered whenever a problem arose, HagermanBot proved highly controversial. Some people saw it as an unseemly infringement of user rights that a bot was basically telling them how to sign their own contributions to the discussion. Hagerman then programmed the bot to offer an opt-out feature, allowing users to deactivate the automatic signature for any comments they made.<\/p>\n<iframe id=\"datawrapper-chart-WKDwW\" src=\"\/\/datawrapper.dwcdn.net\/WKDwW\/1\/\" scrolling=\"no\" frameborder=\"0\" allowtransparency=\"true\" allowfullscreen=\"allowfullscreen\" webkitallowfullscreen=\"webkitallowfullscreen\" mozallowfullscreen=\"mozallowfullscreen\" oallowfullscreen=\"oallowfullscreen\" msallowfullscreen=\"msallowfullscreen\" style=\"width: 0; min-width: 100% !important;\" height=\"565\"><\/iframe><script type=\"text\/javascript\">if(\"undefined\"==typeof window.datawrapper)window.datawrapper={};window.datawrapper[\"WKDwW\"]={},window.datawrapper[\"WKDwW\"].embedDeltas={\"100\":866,\"200\":663,\"300\":609,\"400\":565,\"500\":565,\"600\":565,\"700\":565,\"800\":538,\"900\":538,\"1000\":538},window.datawrapper[\"WKDwW\"].iframe=document.getElementById(\"datawrapper-chart-WKDwW\"),window.datawrapper[\"WKDwW\"].iframe.style.height=window.datawrapper[\"WKDwW\"].embedDeltas[Math.min(1e3,Math.max(100*Math.floor(window.datawrapper[\"WKDwW\"].iframe.offsetWidth\/100),100))]+\"px\",window.addEventListener(\"message\",function(a){if(\"undefined\"!=typeof a.data[\"datawrapper-height\"])for(var b in a.data[\"datawrapper-height\"])if(\"WKDwW\"==b)window.datawrapper[\"WKDwW\"].iframe.style.height=a.data[\"datawrapper-height\"][b]+\"px\"});<\/script>\n<h6>In the 
administrative part of the German-language Wikipedia an increasing share of contributions is made by bots. They are used for diverse purposes &#8212; to add signatures, send out newsletters, etc.<\/h6>\n<p><strong>Bots have a social effect<\/strong><\/p>\n<p>Yet the controversy took a more fundamental turn when users began asking if a bot should even be allowed to alter a comment made by a human. In light of Wikipedia\u2019s consensus-oriented decision-making system, additional concessions were made. Consequently, Hagerman developed a function that allowed users to disable HagermanBot for certain user and discussion pages, a function that other bot programmers ultimately adopted as well.<\/p>\n<p>It comes as no surprise to researchers that such conflicts can flare into major controversies. \u201cWikipedia is a socio-technical system,\u201d says Claudia M\u00fcller-Birn, a professor at Freie Universit\u00e4t Berlin who carries out research on human-computer collaboration, in an interview with blogs.bertelsmann-stiftung.de\/algorithmenethik. That means the site\u2019s technical and social elements cannot simply be separated, but interact with each other on many levels.<\/p>\n<p>The Wikipedia bots work the way speed bumps do on a city street. Even if the traffic devices are only used to enforce existing speed limits, they change the nature of the street itself, for example by making it impossible for anyone, even an ambulance, to go faster than the legal limit. Social consensus is thus transformed into an inescapable rule.<\/p>\n<p><strong>Human authors scared off by bots<\/strong><\/p>\n<p>The use of bots and other software-supported tools has had a lasting influence on the Wikipedia community\u2019s social structure. <strong><a href=\"https:\/\/www-users.cs.umn.edu\/~halfak\/publications\/The_Rise_and_Decline\/\" target=\"_blank\" aria-label=\"Opens in a new tab\" >In one study<\/a><\/strong>, for example, Aaron Halfaker and R. 
Stuart Geiger found that uncompromising efforts to combat vandalism have measurably reduced the number of Wikipedia users.<\/p>\n<p>Based on their data, the researchers maintain that the percentage of good-faith newcomers editing a Wikipedia article for the first time has remained constant over the years. What has increased rapidly, however, is the share of \u201creverts,\u201d the editing changes that undo the contributions made by newcomers, often within seconds. The result is that many potentially valuable authors have been turned away at Wikipedia\u2019s front door \u2013 and possibly discouraged from participating for good.<\/p>\n<p>It is not only bots that are scaring off newcomers by criticizing and reverting their work. The German-language version of Wikipedia has also been struggling to retain editors, even though automated tools such as ClueBot NG are not allowed to autonomously delete contributions made by humans. Instead, the German community deploys software like Huggle, which accesses the same data used by bots in order to decide whether a new entry is vandalism. Ultimately, however, the decision to delete a contribution is not made by an algorithm, but by a human being \u2013 even if this hardly makes a difference to the author in question: As Geiger and Halfaker ascertained, Huggle users reply to only 7% of the queries they receive about reverts they have made.<\/p>\n<p>Having understood the far-reaching implications of using bots, many Wikipedia communities have become more cautious about the processes they use. \u201cAs a bot operator you have to take responsibility for your bot,\u201d says Wikipedia author Freddy2001, who also operates a bot. That is why she has included buttons on her bot user page that people can use to disable the tool. 
In other words, before the automated helper annoys a user, it should simply stop doing what it was programmed to do.<\/p>\n<p><strong>New openness stops user hemorrhage<\/strong><\/p>\n<p>Obviously this is not possible for every bot, since tools such as ClueBot NG are more or less indispensable when it comes to fighting vandalism, a constantly growing problem. In view of how important the online encyclopedia has now become, the Wikipedia communities must always strike a balance between making it easier for existing users to do their work and being open to new users.<\/p>\n<p>The Wikimedia Foundation has now launched a number of initiatives to make the project more attractive to newcomers. For instance, a new program, VisualEditor, makes it easier to create articles without first having to learn all of the site\u2019s complex formatting rules. There are also new mentoring programs and tools people can use to say \u201cthank you\u201d for constructive contributions. Although these measures have not returned the Wikipedia community to its former size, they do seem to have stopped the rapid loss of users.<\/p>\n<p><strong><em>With the launch in 2012 of Wikidata, Wikimedia\u2019s new platform for facts and figures, bots gained another area of application and became an even bigger presence on the site. To learn more, see Part 3 of our series. You can also subscribe to our <\/em><\/strong><a href=\"https:\/\/www.reframetech.de\/feed\/\" target=\"_blank\" aria-label=\"Opens in a new tab\" ><strong><em>RSS<\/em><\/strong><\/a><strong><em> feed or <\/em><\/strong><a href=\"https:\/\/www.reframetech.de\/newsletter\/\" target=\"_blank\" aria-label=\"Opens in a new tab\" ><strong><em>e-mail newsletter<\/em><\/strong><\/a><strong><em> to find out when new posts appear in this blog. <\/em><\/strong><\/p>\n","excerpt":"<p>Bots take care of many routine tasks at Wikipedia. 
The automated assistants greatly influence how the community interacts at the [&hellip;]<\/p>\n","thumbnail":"https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2017\/11\/Wikibots6.jpg","thumbnailsquare":"https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2017\/11\/Wikibots6-370x370.jpg","authors":[{"id":938,"name":"Torsten Kleinz","link":"https:\/\/www.reframetech.de\/en\/blogger\/torsten-kleinz-2\/"}],"categories":[{"id":2,"name":"Uncategorized","link":"https:\/\/www.reframetech.de\/en\/category\/uncategorized\/"}],"tags":[{"id":76,"name":"Bots","link":"https:\/\/www.reframetech.de\/en\/tag\/bots-en\/"},{"id":78,"name":"Wikipedia","link":"https:\/\/www.reframetech.de\/en\/tag\/wikipedia-en\/"}]}