{"id":5289,"title":"AlgoRail in Spain: How algorithmic forecasting can help decrease gender violence","link":"https:\/\/www.reframetech.de\/en\/2020\/08\/12\/algorail-in-spain-how-algorithmic-forecasting-can-help-decrease-gender-violence\/","date":"08\/12\/2020","date_unix":1597214715,"date_modified_unix":1649923817,"date_iso":"2020-08-12T06:45:15+00:00","content":"<p><em>The use of VioG\u00e9n, which has assessed the risk of gender violence in Spain since 2007, has shown that a well-staffed and specially trained police force is vital for its success. <\/em><em>At the sixth stop of our\u00a0AlgoRail\u00a0through Europe, Michele Catanzaro reports how the algorithm has been continuously refined to help protect Spanish women and children from violence.<\/em><\/p>\n<p>In the early morning of 24 February 2018,\u00a0a Spanish psychologist went to a police station to report threats from her husband. According to her statement, her husband had broken the buggy of their younger child and slapped the older one.\u00a0After asking the woman a set of questions and feeding the answers into\u00a0VioG\u00e9n, a software that helps the Spanish police estimate the risk of recidivism in gender violence, the officer issued a report in which the risk was deemed low.<\/p>\n<p><strong>Critical failure<\/strong><\/p>\n<p>A judge denied her request that her husband be forbidden to visit their children, based in part on the low risk estimation made by the police. Seven months later, the husband killed their kids \u201cwith cruelty\u201d and threw himself out of a window.\u00a0The shocking story left people wondering why the case had been deemed low risk.\u00a0VioG\u00e9n\u00a0had missed its goal of supporting police personnel in assessing the risk of new assaults and thus assigning the right level of protection. 
Since the software was first deployed in 2007, there has been a series of \u201clow risk\u201d cases that ended in the homicide of women or children.<\/p>\n<p><strong>Better than nothing<\/strong><\/p>\n<p>Still, the program is by far the most complex of its sort in the world and has reasonable performance indexes. Nobody believes that things would be better without it. But critics point out some flaws. Few police personnel are trained in gender-based violence. Moreover, the program may have systematically underestimated risk. Some victims\u2019 organizations believe that the very possibility of a low risk score is nonsense. Reporting to the police is a\u00a0high-risk\u00a0situation in itself, they say, because abusers perceive it as a challenge.<\/p>\n<p><strong>Reporting an assault<\/strong><\/p>\n<p>When a woman reports an assault by an intimate partner, she triggers a process in which, first, the police agent goes through an online form with her. Questions explore the severity of previous assaults, the features of the aggressor, the vulnerability of the victim and aggravating factors. The answers are fed into a mathematical formula that computes a score, measuring the risk that the aggressor will repeat violent actions. While it is known that the algorithm gives\u00a0more weight to items that empirical studies have shown to be more closely related to recidivism, the exact formula has not been disclosed.<\/p>\n<p><strong>Keeping the score<\/strong><\/p>\n<p>In theory, Spanish agents can increase the score by hand if they perceive a higher risk. But a 2014 study found that they stuck to the automatic outcome in 95% of cases. Once a case\u2019s score is established, the agent decides on a package of protection measures associated with that level of\u00a0risk.\u00a0The police meet with the woman again to fill in a second form, in order to assess whether the situation has worsened or improved. 
This happens periodically,\u00a0more or less frequently\u00a0depending on the risk level. The police stop following up only if judicial measures are not pursued and the risk level falls below medium.<\/p>\n<p><strong>The best available system<\/strong><\/p>\n<p>VioG\u00e9n\u00a0is the best device available to protect women\u2019s lives, according to\u00a0\u00c1ngeles\u00a0Carmona, president of the Domestic and Gender-Based Violence Observatory of the Spanish General Council of the Judiciary.\u00a0She recalls a case she saw in a court in Seville, of an aggressor who had a high risk of recidivism, according to\u00a0VioG\u00e9n. The man was fitted with a tracking wristband. One day, the police saw that the signal of the wristband was moving fast towards the victim\u2019s home. They broke into the home just in time to prevent him from suffocating her with a pillow.<\/p>\n<p>It\u2019s impossible to know how many lives have been saved thanks to\u00a0VioG\u00e9n. A widely used measure of performance for predictive models is the\u202f<a href=\"https:\/\/en.wikipedia.org\/wiki\/Receiver_operating_characteristic#Area_under_the_curve\" target=\"_blank\" aria-label=\"Opens in a new tab\" rel=\"noopener noreferrer\"><strong>Area Under the Curve<\/strong><\/a>\u202f(AUC). A 2017 study that tried to measure how good the system is calculated values between 0.658 and 0.8. An AUC of 0.5 is as good as a coin toss, and an AUC of 1 means the model never fails. 
In other words,\u00a0VioG\u00e9n\u00a0works.\u00a0Comparing VioG\u00e9n with other instruments for assessing the risk of intimate partner violence, one can conclude that it is among the best tools available.<\/p>\n<h3><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-5299\" src=\"https:\/\/www.reframetech.de\/en\/wp-content\/uploads\/sites\/23\/2020\/08\/Vorlage_TwitterKacheln_AlgoTrail_Spain_EN.jpg\" alt=\"\" width=\"1280\" height=\"720\" srcset=\"https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2020\/08\/Vorlage_TwitterKacheln_AlgoTrail_Spain_EN.jpg 1280w, https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2020\/08\/Vorlage_TwitterKacheln_AlgoTrail_Spain_EN-768x432.jpg 768w, https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2020\/08\/Vorlage_TwitterKacheln_AlgoTrail_Spain_EN-600x338.jpg 600w, https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2020\/08\/Vorlage_TwitterKacheln_AlgoTrail_Spain_EN-780x439.jpg 780w\" sizes=\"auto, (max-width: 1280px) 100vw, 1280px\" \/><\/h3>\n<p><strong>Ignored requirements<\/strong><\/p>\n<p>In 2017, there were a total of 654 agents in all of Spain belonging to the Women-Children Teams of the Guardia Civil, far fewer than one per police station.\u00a0This is very different from what the 2004 law that created VioG\u00e9n required. According to it, cases should be dealt with by an interdisciplinary team including psychologists, social workers, and forensic doctors. Several teams were created after the law was passed in 2004, but the process was cut sharply by the austerity that followed the 2008 financial crisis.<\/p>\n<p><strong>VioG\u00e9n\u00a05.0<\/strong><\/p>\n<p>A new protocol was put in place in March 2019, the fifth big change VioG\u00e9n has gone through since its first deployment in 2007. Now, the program identifies cases \u201cof special relevance\u201d, in which the danger is high, and cases \u201cwith minors at risk\u201d. 
How the new scale was built has not been disclosed either, but it was based on a four-year study that sought to identify which factors were specifically related to cases ending in homicide. The new protocol seems to have triggered a major shift in the risk scores of\u00a0VioG\u00e9n: the number of extreme-risk cases rose, and high-risk cases almost doubled.<\/p>\n<p><em>That\u2019s it for this sixth stop of our AlgoRail through Europe, on which we want to learn more about how algorithmic systems are used in our European neighborhood. Next week we will cross the border and continue to <a href=\"https:\/\/www.reframetech.de\/en\/2020\/08\/19\/algorail-digitalization-helps-reduce-prescription-fraud-in-portugal\/\" target=\"_blank\" rel=\"noopener noreferrer\"><strong>Portugal<\/strong><\/a>.<\/em><\/p>\n<hr \/>\n<p><em>This story was shortened by\u00a0<\/em><a href=\"https:\/\/www.reframetech.de\/en\/blogger\/julia-gundlach\/\" target=\"_blank\" rel=\"noopener noreferrer\"><strong><em>Julia Gundlach<\/em><\/strong><\/a><em>. The unabridged story was\u00a0<\/em><a href=\"https:\/\/algorithmwatch.org\/en\/story\/viogen-algorithm-gender-violence\/\" target=\"_blank\" aria-label=\"Opens in a new tab\" rel=\"noopener noreferrer\"><strong><em>published<\/em><\/strong><\/a><em>\u00a0on the AlgorithmWatch website.<\/em><\/p>\n<p><em>The blog series AlgoRail is part of the Automating Society Report 2020 by Bertelsmann Stiftung and AlgorithmWatch, which will be published this fall and is coordinated by\u00a0<\/em><a href=\"https:\/\/www.reframetech.de\/en\/blogger\/dr-sarah-fischer\/\" target=\"_blank\" rel=\"noopener noreferrer\"><strong><em>Dr. Sarah Fischer<\/em><\/strong><\/a><em>. 
In addition to journalistic stories like this one, the report gives an overview of various examples of algorithmic systems as well as current debates, policy responses and key players in 15 countries.\u00a0<\/em><a href=\"https:\/\/www.bertelsmann-stiftung.de\/de\/publikationen\/publikation\/did\/automating-society\" target=\"_blank\" aria-label=\"Opens in a new tab\" rel=\"noopener noreferrer\"><strong><em>A first issue of the report<\/em><\/strong><\/a><em>\u00a0was published in January 2019.<\/em><\/p>\n<hr \/>\n<p>This text is licensed under a\u00a0<a href=\"http:\/\/creativecommons.org\/licenses\/by\/4.0\/\" target=\"_blank\" aria-label=\"Opens in a new tab\" rel=\"noopener noreferrer\"><strong>Creative Commons Attribution 4.0 International License<\/strong><\/a><\/p>\n","excerpt":"<p>The use of VioG\u00e9n, which has assessed the risk of gender violence in Spain since 2007, has shown that a well-staffed [&hellip;]<\/p>\n","thumbnail":"https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2020\/08\/33708709904_70dfee243e_k-5-1-780x373.jpg","thumbnailsquare":"https:\/\/www.reframetech.de\/wp-content\/uploads\/sites\/23\/2020\/08\/33708709904_70dfee243e_k-5-1-370x370.jpg","authors":[{"id":5306,"name":"Michele Catanzaro","link":"https:\/\/www.reframetech.de\/en\/blogger\/michele-catanzaro\/"}],"categories":[{"id":2,"name":"Uncategorized","link":"https:\/\/www.reframetech.de\/en\/category\/uncategorized\/"}],"tags":[{"id":638,"name":"AlgoRail","link":"https:\/\/www.reframetech.de\/en\/tag\/algorail-en\/"},{"id":263,"name":"Algorithm","link":"https:\/\/www.reframetech.de\/en\/tag\/algorithm\/"},{"id":538,"name":"predictive policing","link":"https:\/\/www.reframetech.de\/en\/tag\/predictive-policing-en\/"}]}