
Research by the charity Statewatch has found that the UK’s Ministry of Justice is developing a ‘murder prediction’ tool, which funnels individuals’ data into algorithms to produce a risk assessment. Understandably, this has raised concerns.
Statewatch published a report on March 31 that said: ‘The Homicide Prediction Project uses police and government data to profile people with the aim of ‘predicting’ who is ‘at risk’ of committing murder in future. It is a collaboration between the Ministry of Justice (MoJ), the Home Office’ and the police.
The tool, the organisation claimed, ‘uses data from the MoJ, [and] the Police National Computer … to attempt to make these ‘predictions’’.
Statewatch added that: ‘The MoJ says in the documents [that it uncovered via Freedom of Information requests] that the project is intended to ‘[e]xplore the power of MoJ datasets in relation to assessment of homicide risk’. The MoJ Data Science team then ‘develops models’ seeking ‘the powerful predictors in the data for homicide risk’. The documents refer to the ‘future operationalisation’ of the system.’
This project is part of the MoJ’s ‘Data Science Hub’, which seeks to use ‘data and information to make excellent decisions using cutting edge tools, techniques and collaboration and putting evidence at the heart of the justice system’, Statewatch added.
Here is a link to the original report, and here is a link to the Guardian piece based on the charity’s work.
—
So, should we be worried?
Well, first let’s look at algorithms and data when it comes to justice. We have been here before. Algorithms, i.e. mathematical formulae that seek to model likely outcomes, have long been used to assess whether a person poses a risk and needs to be kept on remand. In the US we have seen the now famous COMPAS system, which is used to ‘assess the likelihood of a defendant becoming a recidivist’.
And of course, judges and parole teams around the world use ‘guidance’ to decide what to do next with a defendant or convicted criminal; in some ways that guidance is algorithmic too. Credit agencies, insurance companies and banks likewise use algorithms and personal data to make predictions about how people will behave in the future.
We have also seen, for more than a decade now, efforts at ‘predictive policing’, where resources are moved to likely flashpoints before things kick off.
But, is predicting murder a step too far? What happens to someone if they have this ‘death mark’ on their record, sitting in a database somewhere, even though they may have never committed such a crime before?
Moreover, any fixed formula embeds biases – because there will always be biases in any system. One good reason why we should use humans to make key judgments is that they can evolve their views quickly and adapt to new ideas and insights. Fixed systems can be very blunt instruments that don’t really reflect the current – or in fact any – actual reality.
Also, it has to be said, this is not some amazingly advanced tech. It is not using telepaths and people with foresight, as in the movie ‘Minority Report’. It’s much more mundane. Data is collected on an individual, those numbers are weighted, and they are pushed through a ‘mechanism’ that comes out with another number. If that number is high, then you get marked down as a risk. If not, then you don’t. Just the same as getting a credit card or not.
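To make that mundanity concrete, here is a minimal, purely illustrative sketch of that kind of weighted scoring. The feature names, weights and threshold below are invented for the example – they are assumptions for illustration only and have nothing to do with the MoJ’s actual model.

```python
import math

# Purely illustrative: these feature names, weights and the threshold are
# invented for this example, not taken from the MoJ project or any real system.
WEIGHTS = {
    "prior_convictions": 0.8,
    "age_at_first_offence": -0.05,
    "police_contacts_last_year": 0.4,
}
BIAS = -3.0
THRESHOLD = 0.5  # above this, the person gets flagged as 'high risk'

def risk_score(person: dict) -> float:
    """Weighted sum of the person's data, squashed into a 0-1 'probability'."""
    z = BIAS + sum(weight * person.get(feature, 0)
                   for feature, weight in WEIGHTS.items())
    return 1 / (1 + math.exp(-z))

def is_flagged(person: dict) -> bool:
    return risk_score(person) > THRESHOLD

# One record in, one number out - just like applying for a credit card.
someone = {
    "prior_convictions": 2,
    "age_at_first_offence": 19,
    "police_contacts_last_year": 3,
}
print(round(risk_score(someone), 3), is_flagged(someone))  # e.g. 0.24 False
```

The point is the shape of the process rather than the maths: fixed inputs, fixed weights, one number out, and a hard cut-off that decides whether you get flagged.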
But, is that how crime works? While it’s probably true that ‘career criminals’ can be predicted to keep transgressing, do all murders happen as part of a long and slow trajectory that builds up from not paying a traffic ticket, to arguing with a traffic warden, and then up and up the scale to eventually committing the most heinous of crimes?
This site doesn’t claim any expertise in criminology, but the newspapers are full of stories about people who seem ‘normal’ to everyone around them and then ‘go mad’. Moreover, what about those who have done wrong, but are trying to ‘sort their lives out’?
Finally, do you even want ‘the State’ making decisions about people like this, i.e. deciding ‘before the fact’ that they are potentially the worst of all offenders? Is that the beginning of a slippery slope toward governments making pre-emptive prosecutions (as explored in the movie Minority Report)? Will the Old Bill pop around to people’s houses and say: ‘Excuse me, but we were wondering if you plan on killing someone this weekend? You see, we have you in this algorithm and it’s looking a bit iffy.’
No doubt the debate will continue. But, either way, it’s important that the debate is held in public.
—
What do you think?